CN113619607A - Control method and control system for automobile running - Google Patents

Control method and control system for automobile running Download PDF

Info

Publication number
CN113619607A
CN113619607A (application CN202111092603.5A); granted as CN113619607B
Authority
CN
China
Prior art keywords
information
driving
control
driving environment
vehicle
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202111092603.5A
Other languages
Chinese (zh)
Other versions
CN113619607B (en)
Inventor
何正峰
赵培
高博
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hozon New Energy Automobile Co Ltd
Original Assignee
Hozon New Energy Automobile Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hozon New Energy Automobile Co Ltd filed Critical Hozon New Energy Automobile Co Ltd
Priority to CN202111092603.5A priority Critical patent/CN113619607B/en
Publication of CN113619607A publication Critical patent/CN113619607A/en
Application granted granted Critical
Publication of CN113619607B publication Critical patent/CN113619607B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B60: VEHICLES IN GENERAL
    • B60W: CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W60/00: Drive control systems specially adapted for autonomous road vehicles
    • B60W40/00: Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub-unit, e.g. by using mathematical models
    • B60W40/02: Estimation or calculation of such parameters related to ambient conditions
    • B60W40/04: Traffic conditions
    • B60W2554/00: Input parameters relating to objects
    • B60W2554/40: Dynamic objects, e.g. animals, windblown objects
    • B60W2554/402: Type
    • B60W2554/4029: Pedestrians
    • B60W2555/00: Input parameters relating to exterior conditions, not covered by groups B60W2552/00, B60W2554/00
    • B60W2555/20: Ambient conditions, e.g. wind or rain

Landscapes

  • Engineering & Computer Science (AREA)
  • Automation & Control Theory (AREA)
  • Transportation (AREA)
  • Mechanical Engineering (AREA)
  • Physics & Mathematics (AREA)
  • Mathematical Physics (AREA)
  • Human Computer Interaction (AREA)
  • User Interface Of Digital Computer (AREA)
  • Traffic Control Systems (AREA)

Abstract

The invention provides a control method and a control system for automobile driving. The method comprises the following steps: acquiring driving environment information, wherein the driving environment information comprises traffic participant information, driving track information, cloud platform information, and vehicle state information; constructing a driving environment scene from the driving environment information and presenting it on an artificial intelligence display device; receiving and recognizing a control instruction issued in response to the driving environment scene; and controlling the automobile according to the control instruction. The control method and control system can present driving-related information to the user, so that the user can intelligently control the driving of the automobile according to the received information.

Description

Control method and control system for automobile running
Technical Field
The invention mainly relates to the field of automobile control, in particular to a control method and a control system for automobile driving.
Background
With the development of automobile industry technology, intelligent driving is maturing rapidly. A fully driverless L5-level system equipped with cameras, millimeter-wave radar, ultrasonic radar, lidar, high-precision map data, and V2X and 5G technologies is the technical goal that future automobiles aim to reach, and mass-production projects will gradually land as technical experience accumulates. At that point, the car will no longer have a cockpit but will instead be a full passenger cabin: the driver can be freed from the heavy and complicated driving task and hand the driving work over to the driverless system. However, at the current stage of autonomous-driving development, attention focuses mainly on the automatic driving of the vehicle itself; no effective interface for driver participation is provided, and the driver's real-time, actual control over the vehicle while it drives in autonomous mode is often neglected.
Disclosure of Invention
The technical problem to be solved by the invention is to provide a control method and a control system for automobile driving, which can present driving related information to a user, so that the user can intelligently control the automobile driving according to the received information.
In order to solve the technical problem, the invention provides a method for controlling the running of an automobile, which comprises the following steps: acquiring running environment information, wherein the running environment information comprises traffic participant information, running track information, cloud platform information and vehicle state information; constructing a driving environment scene according to the driving environment information, and presenting the driving environment scene in artificial intelligent display equipment; receiving and identifying a control instruction sent out according to the driving environment scene; and controlling the automobile to run according to the control instruction.
In an embodiment of the invention, the traffic participant information includes pedestrian information and other vehicle information, the travel track information includes a traffic path and a road surrounding environment in the whole travel process, the cloud platform information includes weather information, and the vehicle state information includes vehicle basic parameter information.
In an embodiment of the present invention, constructing a driving environment scene according to the driving environment information includes constructing the driving environment scene in a plurality of layers, where each layer includes a part of the driving environment information.
In an embodiment of the present invention, the number of the plurality of layers is 5, and the plurality of layers are respectively a first layer, a second layer, a third layer, a fourth layer, and a fifth layer, where the first layer includes information that attributes in the driving environment information are static, the second layer includes the driving track information, the third layer includes the traffic participant information, the fourth layer includes the vehicle state information, and the fifth layer includes the cloud platform information.
In an embodiment of the invention, before the driving environment scene is constructed from the driving environment information, the method further comprises fusing the traffic participant information, the driving track information, the cloud platform information and the vehicle state information by means of Mahalanobis distance, the Hungarian matching algorithm and/or Kalman filtering.
In an embodiment of the invention, the control instructions comprise speech, limb movements and/or brain wave data, and the method comprises identifying the control instructions using a trained intent recognition model.
In an embodiment of the invention, before the control instruction sent according to the driving environment scene is acquired, intention recognition training is further performed, including training the voice, the limb movement and/or the brain wave data based on deep learning in a non-driving state, and obtaining respective training parameters to construct the intention recognition model.
In an embodiment of the invention, when the control instruction is a mixed control instruction combining two or more of voice, limb movement and brain wave data, before controlling the automobile according to the control instruction, the method further comprises fusing the training parameters by means of proportional-integral-derivative (PID) control and model-algorithm control.
In an embodiment of the invention, the artificial intelligence display device comprises a virtual reality display device.
In order to solve the above technical problem, the present invention further provides a control system for driving an automobile, including: the data acquisition module is configured to acquire driving environment information, wherein the driving environment information comprises traffic participant information, driving track information, cloud platform information and vehicle state information; the environment scene construction module is configured to construct a driving environment scene according to the driving environment information and present the driving environment scene in the artificial intelligent display device; the instruction receiving module is configured to receive and identify a control instruction sent according to the driving environment scene; and the instruction control module is configured to control the automobile to run according to the control instruction.
In an embodiment of the present invention, the environment scene constructing module is further configured to construct the driving environment scene by dividing into a plurality of layers, where each layer includes a part of the driving environment information.
In another aspect, the present invention provides a control system for driving an automobile, comprising: a memory for storing instructions executable by a processor; and the processor, for executing the instructions to implement the above control method for driving an automobile.
Another aspect of the present invention also provides a computer readable medium storing computer program code, which when executed by a processor implements the above-described control method for vehicle driving.
Compared with the prior art, the invention has the following advantages: the control method and control system for automobile driving provided by the invention combine AI technology to offer an intelligent control scheme for automobile driving. After the driving environment information is acquired, the driving environment scene is constructed in multiple layers; the scheme can work with an AI display device to intelligently present driving-related information to the user, while the user, through control modes such as voice combined with AI techniques, intelligently controls the driving of the automobile according to the received information.
Drawings
The accompanying drawings, which are included to provide a further understanding of the application and are incorporated in and constitute a part of this application, illustrate embodiment(s) of the application and together with the description serve to explain the principle of the invention. In the drawings:
fig. 1 is a schematic flow chart of a method for controlling the driving of a vehicle according to an embodiment of the present invention;
fig. 2 is a schematic diagram of a driving environment scene constructed by multiple layers in a control method for driving an automobile according to an embodiment of the present invention;
FIG. 3 is a system block diagram of a control system for driving a vehicle according to an embodiment of the present invention;
fig. 4 is a system block diagram of a control system for driving a vehicle according to another embodiment of the present invention;
FIG. 5 is a schematic diagram of a method and system for controlling vehicle operation according to an embodiment of the present invention; and
fig. 6 is a schematic diagram of intention recognition training in a control method for vehicle driving according to an embodiment of the present invention.
Detailed Description
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings used in the description of the embodiments are briefly introduced below. It is obvious that the drawings in the following description are only examples or embodiments of the application, based on which the application can also be applied to other similar scenarios without inventive effort by a person skilled in the art. Unless otherwise apparent from the context, or otherwise indicated, like reference numbers in the figures refer to the same structure or operation.
As used in this application and the appended claims, the terms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. In general, the terms "comprises" and "comprising" merely indicate that the explicitly identified steps and elements are included; these steps and elements do not form an exclusive list, and a method or apparatus may include other steps or elements.
The relative arrangement of the components and steps, the numerical expressions, and numerical values set forth in these embodiments do not limit the scope of the present application unless specifically stated otherwise. Meanwhile, it should be understood that the sizes of the respective portions shown in the drawings are not drawn in an actual proportional relationship for the convenience of description. Techniques, methods, and apparatus known to those of ordinary skill in the relevant art may not be discussed in detail but are intended to be part of the specification where appropriate. In all examples shown and discussed herein, any particular value should be construed as merely illustrative, and not limiting. Thus, other examples of the exemplary embodiments may have different values. It should be noted that: like reference numbers and letters refer to like items in the following figures, and thus, once an item is defined in one figure, further discussion thereof is not required in subsequent figures.
In the description of the present application, it is to be understood that the orientation or positional relationship indicated by the directional terms such as "front, rear, upper, lower, left, right", "lateral, vertical, horizontal" and "top, bottom", etc., are generally based on the orientation or positional relationship shown in the drawings, and are used for convenience of description and simplicity of description only, and in the case of not making a reverse description, these directional terms do not indicate and imply that the device or element being referred to must have a particular orientation or be constructed and operated in a particular orientation, and therefore, should not be considered as limiting the scope of the present application; the terms "inner and outer" refer to the inner and outer relative to the profile of the respective component itself.
Spatially relative terms, such as "above," "over," "on top of," and the like, may be used herein for ease of description to describe one device's or feature's spatial relationship to another device or feature as illustrated in the figures. It will be understood that the spatially relative terms are intended to encompass different orientations of the device in use or operation in addition to the orientation depicted in the figures. For example, if a device in the figures is turned over, devices described as "above" or "on" other devices or configurations would then be oriented "below" or "under" the other devices or configurations. Thus, the exemplary term "above" can encompass both an orientation of "above" and one of "below." The device may be otherwise oriented (rotated 90 degrees or at other orientations), and the spatially relative descriptors used herein are interpreted accordingly.
It should be noted that terms such as "first" and "second" are used only for convenience in distinguishing the corresponding components; unless otherwise stated, these terms carry no special meaning and therefore should not be construed as limiting the scope of protection of the present application. Further, although the terms used in this application are selected from publicly known and commonly used terms, some of them may have been selected by the applicant at his or her discretion, and their detailed meanings are described in the relevant parts of the description herein. The present application should therefore be understood not only through the actual terms used but also through the meaning carried by each term.
It will be understood that when an element is referred to as being "on," "connected to," "coupled to" or "contacting" another element, it can be directly on, connected or coupled to, or contacting the other element or intervening elements may be present. In contrast, when an element is referred to as being "directly on," "directly connected to," "directly coupled to" or "directly contacting" another element, there are no intervening elements present. Similarly, when a first component is said to be "in electrical contact with" or "electrically coupled to" a second component, there is an electrical path between the first component and the second component that allows current to flow. The electrical path may include capacitors, coupled inductors, and/or other components that allow current to flow even without direct contact between the conductive components.
An embodiment of the present invention provides a method for controlling driving of an automobile, which can present driving-related information to a user, so that the user can intelligently control driving of the automobile according to the received information.
As shown in fig. 1, a flow chart of a method 10 for controlling the driving of a vehicle according to the present invention is schematically shown. This application uses flowcharts to illustrate the operations performed by a system according to embodiments of the present application. It should be understood that the operations need not be performed exactly in the order shown; rather, various steps may be processed in reverse order or simultaneously, and other operations may be added to or removed from these processes.
According to fig. 1, a method 10 for controlling the driving of a motor vehicle comprises the following steps.
First, step 11 is to obtain driving environment information, where the driving environment information includes traffic participant information, driving track information, cloud platform information, and vehicle state information.
Specifically, in some embodiments of the present invention, the traffic participant information includes pedestrian information and other vehicle information, the travel track information includes a traffic path and a road surrounding environment in the whole travel process, the cloud platform information includes weather information, and the vehicle state information includes vehicle basic parameter information.
Acquiring the driving environment information in step 11 proceeds from the information needs of vehicle driving: complete driving-related information is collected from multiple angles and dimensions, providing a comprehensive and reliable basis on which the driver can better control the vehicle in the later steps.
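The four information categories of step 11 can be sketched as a simple container; all class and field names below are illustrative assumptions for this sketch, not structures defined by the patent.

```python
from dataclasses import dataclass, field

@dataclass
class TrafficParticipants:
    pedestrians: list = field(default_factory=list)     # pedestrian information
    other_vehicles: list = field(default_factory=list)  # other-vehicle information

@dataclass
class DrivingEnvironmentInfo:
    participants: TrafficParticipants                    # traffic participant information
    track: dict = field(default_factory=dict)            # traffic path + road surroundings
    cloud: dict = field(default_factory=dict)            # cloud platform info, e.g. weather
    vehicle_state: dict = field(default_factory=dict)    # basic vehicle parameters

env = DrivingEnvironmentInfo(
    participants=TrafficParticipants(pedestrians=[{"id": 1, "pos": (3.0, 4.0)}]),
    track={"path": ["A", "B"], "surroundings": "urban"},
    cloud={"weather": "rain"},
    vehicle_state={"speed_kmh": 42.0},
)
print(env.cloud["weather"])  # -> rain
```

Grouping the input this way mirrors the step-11 acquisition: each category stays addressable on its own while traveling together as one record into scene construction.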
Further referring to fig. 1, step 12 is to construct a driving environment scene according to the driving environment information and present the driving environment scene in the artificial intelligence display device.
Preferably, in some embodiments of the present invention, constructing the driving environment scene according to the driving environment information in step 12 includes constructing the driving environment scene in a plurality of layers, and each layer includes a part of the driving environment information obtained in step 11.
Illustratively, as shown in fig. 2, a schematic diagram of a driving environment scene is constructed by dividing step 12 into a plurality of layers according to an embodiment of the present invention. As shown in fig. 2, the number of the plurality of layers is 5, which are respectively a first layer 21, a second layer 22, a third layer 23, a fourth layer 24, and a fifth layer 25.
Specifically, corresponding to the driving environment information obtained in step 11, in each layer, the first layer 21 includes information with a static attribute in the driving environment information, the second layer 22 includes driving track information, the third layer 23 includes traffic participant information, the fourth layer 24 includes vehicle state information, and the fifth layer 25 includes cloud platform information.
It is understood that the present invention is not limited thereto, for example, in other embodiments of the present invention, the number of layers and the information included in each layer may differ from those in fig. 2 and the above description.
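The five-layer split of fig. 2 can be illustrated as follows; the dictionary keys and layer names are assumptions made for this sketch, and (as the text above notes) the number of layers and their contents may differ in other embodiments.

```python
def build_scene_layers(env: dict) -> dict:
    """Map each category of driving environment information to one layer (fig. 2)."""
    return {
        "layer1_static":       env.get("static", {}),         # statically attributed information
        "layer2_track":        env.get("track", {}),          # driving track information
        "layer3_participants": env.get("participants", {}),   # traffic participant information
        "layer4_vehicle":      env.get("vehicle_state", {}),  # vehicle state information
        "layer5_cloud":        env.get("cloud", {}),          # cloud platform information
    }

layers = build_scene_layers({
    "static": {"lanes": 3},
    "track": {"path": ["A", "B"]},
    "participants": {"pedestrians": 2},
    "vehicle_state": {"speed_kmh": 60},
    "cloud": {"weather": "clear"},
})
print(len(layers))  # -> 5
```

Keeping each information category in its own layer lets the display device render or refresh layers independently, which is the practical benefit of the layered construction described above.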
On the other hand, in some embodiments of the present invention, before the driving environment scene is constructed from the driving environment information, data fusion is further performed on the traffic participant information, the driving track information, the cloud platform information and the vehicle state information, and the result is finally packaged into uniform, analyzable data as the input of the following steps.
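The claims mention Mahalanobis distance and the Hungarian matching algorithm for this fusion; a toy association step combining the two is sketched below. The track positions, detections and covariance are invented values, and this is a generic sketch of the technique, not the patent's actual fusion pipeline.

```python
import numpy as np
from scipy.optimize import linear_sum_assignment  # Hungarian algorithm

# Predicted track positions (e.g. from a Kalman prediction step) and new detections
tracks = np.array([[0.0, 0.0], [10.0, 10.0]])
detections = np.array([[0.5, -0.2], [9.8, 10.3]])
cov_inv = np.linalg.inv(np.diag([0.25, 0.25]))  # assumed measurement covariance, inverted

# Mahalanobis distance between every track/detection pair
cost = np.zeros((len(tracks), len(detections)))
for i, t in enumerate(tracks):
    for j, d in enumerate(detections):
        diff = t - d
        cost[i, j] = np.sqrt(diff @ cov_inv @ diff)

# Hungarian matching minimizes total association cost
row, col = linear_sum_assignment(cost)
print(list(zip(row.tolist(), col.tolist())))  # -> [(0, 0), (1, 1)]
```

In a full pipeline the matched pairs would then feed a Kalman filter update, giving exactly the Mahalanobis / Hungarian / Kalman combination the claims name.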
It is noted that in some embodiments of the present invention, the artificial intelligence display device used to render the driving environment scene in step 12 comprises a virtual reality display device. In this way, the driver can perceive the surrounding driving environment immersively during the journey, in preparation for intelligently controlling the vehicle in the next step.
Further, according to fig. 1, step 13 is to receive and recognize a control command issued according to the driving environment scene.
Illustratively, in some embodiments of the invention, the control directives involved in step 13 include speech, limb movement, and/or brain wave data, and the control directives are recognized using a trained intent recognition model.
In addition, in some embodiments of the present invention, before step 13 is executed to obtain the control instruction issued according to the driving environment scene, intent recognition training is performed: the voice, limb movement and/or brain wave data are trained based on deep learning in a non-driving state, and the respective training parameters are obtained so as to construct the intent recognition model, according to which data comprising voice, limb movement and/or brain wave data can then be recognized.
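The "train offline, recognize online" split can be illustrated with a deliberately tiny stand-in model. The patent trains deep-learning models on voice, limb and brain-wave data; here a nearest-centroid classifier over invented two-dimensional feature vectors plays the role of the trained model, purely to show the structure of the split.

```python
import numpy as np

# Invented labelled training features (offline phase)
TRAINING = {
    "accelerate": np.array([[1.0, 0.1], [0.9, 0.0]]),
    "brake":      np.array([[0.0, 1.0], [0.1, 0.9]]),
}

# Offline "training": one centroid per intent label stands in for the
# deep-learning training parameters of the patent.
centroids = {label: feats.mean(axis=0) for label, feats in TRAINING.items()}

def recognize(features: np.ndarray) -> str:
    """Online recognition: return the intent whose centroid is nearest."""
    return min(centroids, key=lambda lbl: np.linalg.norm(features - centroids[lbl]))

print(recognize(np.array([0.95, 0.05])))  # -> accelerate
print(recognize(np.array([0.05, 0.95])))  # -> brake
```

The same two-phase shape holds for the real system: heavy training happens once in a non-driving state, and only the lightweight recognition step runs while the vehicle is on the road.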
Finally, according to fig. 1, step 14 is to control the vehicle to run according to the control command.
It will be appreciated that the control commands received and recognized in step 13 may be of a single type or of a mixed type. Preferably, when the control command is a mixed command combining two or more of voice, limb movement and brain wave data, before step 14 controls the automobile according to the control command, the method further comprises fusing the training parameters by means of proportional-integral-derivative (PID) control and model-algorithm control.
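The PID control mentioned above can be sketched as a generic discrete PID step; gains, setpoint and measurement below are illustrative, and this is a textbook PID loop rather than the patent's actual parameter-fusion controller.

```python
class PID:
    """Discrete proportional-integral-derivative controller."""

    def __init__(self, kp: float, ki: float, kd: float, setpoint: float):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.setpoint = setpoint
        self.integral = 0.0
        self.prev_error = None

    def step(self, measurement: float, dt: float) -> float:
        error = self.setpoint - measurement
        self.integral += error * dt
        derivative = 0.0 if self.prev_error is None else (error - self.prev_error) / dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

pid = PID(kp=0.5, ki=0.1, kd=0.05, setpoint=60.0)  # e.g. a target speed in km/h
u = pid.step(measurement=50.0, dt=0.1)
print(round(u, 3))  # first step: 0.5*10 + 0.1*(10*0.1) = 5.1
```

In a mixed-command setting, an output like `u` would be one ingredient blended with the model-algorithm control path before the final actuation command is issued.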
The above control method 10 for vehicle driving combines AI technology to provide an intelligent control scheme for automobile driving. After the driving environment information is obtained, the driving environment scene is constructed in multiple layers, and driving-related information is intelligently presented to the user through an AI display device such as a virtual reality display device. Furthermore, the user intelligently controls the driving of the automobile according to the received information through control modes such as voice, combined with AI techniques such as the trained parameter models.
In another aspect of the present invention, a system for controlling driving of a vehicle is also provided, which can present driving-related information to a user, so that the user can intelligently control driving of the vehicle according to the received information.
Fig. 3 is a system block diagram of a control system 30 for driving a vehicle according to an embodiment of the present invention. The control system 30 includes a data acquisition module 31, an environmental scene construction module 32, an instruction receiving module 33, and an instruction control module 34. The data acquiring module 31 is configured to acquire driving environment information, where the driving environment information includes traffic participant information, driving track information, cloud platform information, and vehicle state information. The environmental scene construction module 32 is configured to construct a driving environment scene from the driving environment information and present the driving environment scene in the artificial intelligence display device. The instruction receiving module 33 is configured to receive and recognize a control instruction issued according to a traveling environment scene. The command control module 34 is configured to control the vehicle to run according to the control command.
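The four modules of control system 30 can be mirrored as a minimal skeleton; the method names and the canned data are assumptions made for this sketch, not interfaces defined by the patent.

```python
class ControlSystem:
    """Skeleton of control system 30: modules 31-34 as methods."""

    def acquire(self) -> dict:
        # Data acquisition module 31: stubbed driving environment information
        return {"cloud": {"weather": "clear"}, "vehicle_state": {"speed_kmh": 60}}

    def build_scene(self, env: dict) -> dict:
        # Environment scene construction module 32
        return {"layers": list(env.keys())}

    def receive_instruction(self, raw: str) -> str:
        # Instruction receiving module 33: normalize the raw user command
        return raw.strip().lower()

    def control(self, instruction: str) -> str:
        # Instruction control module 34: act on the recognized command
        return f"executing: {instruction}"

    def run_once(self, raw_instruction: str) -> str:
        env = self.acquire()
        self.build_scene(env)
        cmd = self.receive_instruction(raw_instruction)
        return self.control(cmd)

print(ControlSystem().run_once("  Slow Down "))  # -> executing: slow down
```

The one-pass `run_once` chain makes the data flow of fig. 3 explicit: acquisition feeds scene construction, the scene prompts a user instruction, and the instruction drives the control action.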
In particular, in some embodiments of the present invention, the environment scene construction module 32 is further configured to construct the driving environment scene in a plurality of layers, each layer including a portion of the driving environment information acquired by the data acquisition module.
For other details of the control system for vehicle driving according to the present invention, reference may be made to the above-mentioned details of the control method for vehicle driving according to the present invention shown in fig. 1-2, which are not repeated herein.
Another aspect of the present invention provides a control system for driving an automobile, including: a memory for storing instructions executable by the processor; and the processor is used for executing instructions to realize the control method for driving the automobile.
Illustratively, FIG. 4 shows a system block diagram of a vehicle travel control system 40 having a memory and a processor. The vehicle travel control system 40 may include an internal communication bus 41, a processor 42, a read-only memory (ROM) 43, a random-access memory (RAM) 44, and a communication port 45. When implemented on a personal computer, the vehicle travel control system 40 may also include a hard disk 46.
The internal communication bus 41 may enable data communication among the components of the control system 40 for vehicle travel. Processor 42 may make the determination and issue a prompt. In some embodiments, processor 42 may be comprised of one or more processors. The communication port 45 can realize data communication between the control system 40 for driving the automobile and the outside. In some embodiments, the control system 40 for vehicle operation may send and receive information and data from the network via the communication port 45.
The vehicle travel control system 40 may also include various forms of program storage units and data storage units, such as the hard disk 46, the read-only memory (ROM) 43 and the random-access memory (RAM) 44, capable of storing various data files for computer processing and/or communication, as well as program instructions for execution by the processor 42. The processor executes these instructions to implement the main parts of the method. The results processed by the processor are communicated to the user device through the communication port and displayed on the user interface.
Finally, another aspect of the present invention also provides a computer readable medium storing computer program code, which when executed by a processor implements the above-mentioned method for controlling the driving of a vehicle.
The control method and system for automobile driving of the present invention can present driving-related information to the user through intelligent means during the driving process, so that the user can intelligently control the driving of the automobile according to the received information. To better illustrate the control method and system, fig. 5 and 6 give schematic diagrams of the technical solution of the invention.
First, according to fig. 5, as a whole, the driving environment information acquired from the various information platforms serves as the data input for constructing the driving environment scene, which is then projected onto the artificial intelligence display device. Specifically, the driving environment information includes traffic participant information, driving track information (map positioning, etc.), cloud platform information, vehicle state information, and the like. In particular, the traffic participant information can be obtained through various channels, such as base-station data or information collected by the vehicle-body sensors, and its fusion ultimately supports the construction of the driving environment scene. The vehicle state information, on the other hand, comprises vehicle dynamics information and vehicle perception information: the former assists in completing high-precision positioning of the vehicle, while the latter directly helps build the driving environment scene. In this way, the driving environment information finally realizes the construction of a driving environment scene that presents the information needed during driving to the user in an all-round manner. On this basis, the user can issue control commands according to the actual driving conditions and requirements, thereby controlling the running of the vehicle.
More specifically, fig. 6 shows a schematic diagram of intention recognition in the control method of the running of the automobile of the present invention. According to fig. 6, the intention recognition process is divided into two parts: offline training and online recognition. Voice recognition training, limb recognition training and consciousness recognition training are performed by inputting voice and voice labels, limb motions and limb labels, and intentions and intention labels, respectively; the training results are then fused and loaded into the automobile system. In subsequent use, the user can therefore control the vehicle by voice, gestures, brain wave input and the like, completing the intelligent control link in the driving process of the vehicle.
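How the separately trained recognizers are fused is likewise left open in the text. One simple possibility is late fusion: each modality's classifier outputs per-intent confidence scores, which are combined with fixed weights before the winning intent is selected. The sketch below is hypothetical (the intent names, weights and scores are invented, not taken from the patent):

```python
def fuse_intents(modality_scores, weights):
    """Late fusion of per-modality intent confidences.

    `modality_scores` maps modality name -> {intent: confidence};
    `weights` maps modality name -> fusion weight. Returns the intent
    with the highest weighted total confidence.
    """
    totals = {}
    for modality, scores in modality_scores.items():
        w = weights.get(modality, 0.0)
        for intent, conf in scores.items():
            totals[intent] = totals.get(intent, 0.0) + w * conf
    return max(totals, key=totals.get)

# Hypothetical outputs of the offline-trained recognizers for one command.
scores = {
    "voice":   {"change_lane": 0.7, "slow_down": 0.3},
    "gesture": {"change_lane": 0.4, "slow_down": 0.6},
}
weights = {"voice": 0.6, "gesture": 0.4}
print(fuse_intents(scores, weights))  # "change_lane"
```

The per-modality weights here play a role analogous to the fused "training parameters" of claim 8; a real system could tune them (e.g. by a PID-style or model-based scheme, as the claim suggests) rather than fixing them by hand.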
Having thus described the basic concept, it will be apparent to those skilled in the art that the foregoing disclosure is by way of example only, and is not intended to limit the present application. Various modifications, improvements and adaptations to the present application may occur to those skilled in the art, although not explicitly described herein. Such modifications, improvements and adaptations are proposed in the present application and thus fall within the spirit and scope of the exemplary embodiments of the present application.
Also, this application uses specific language to describe embodiments of the application. Reference throughout this specification to "one embodiment," "an embodiment," and/or "some embodiments" means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment of the present application. Therefore, it is emphasized and should be appreciated that two or more references to "an embodiment," "one embodiment," or "an alternative embodiment" in various places throughout this specification are not necessarily all referring to the same embodiment. Furthermore, particular features, structures, or characteristics of one or more embodiments of the present application may be combined as appropriate.
Aspects of the present application may be embodied entirely in hardware, entirely in software (including firmware, resident software, micro-code, etc.) or in a combination of hardware and software. The above hardware or software may be referred to as a "data block," "module," "engine," "unit," "component," or "system." The processor may be one or more application-specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field-programmable gate arrays (FPGAs), processors, controllers, microcontrollers, microprocessors, or a combination thereof. Furthermore, aspects of the present application may be embodied as a computer product, including computer-readable program code, on one or more computer-readable media. For example, computer-readable media may include, but are not limited to, magnetic storage devices (e.g., hard disks, floppy disks, magnetic strips), optical disks (e.g., compact disks (CDs), digital versatile disks (DVDs)), smart cards, and flash memory devices (e.g., cards, sticks, key drives).
The computer-readable medium may comprise a propagated data signal with the computer program code embodied therein, for example, in baseband or as part of a carrier wave. The propagated signal may take any of a variety of forms, including electromagnetic, optical, and the like, or any suitable combination. The computer-readable medium can be any computer-readable medium that can communicate, propagate, or transport the program for use by or in connection with an instruction execution system, apparatus, or device. Program code on a computer-readable medium may be transmitted over any suitable medium, including radio, electrical cable, fiber-optic cable, radio-frequency signal, or the like, or any combination of the preceding.
Similarly, it should be noted that in the preceding description of embodiments of the application, various features are sometimes grouped together in a single embodiment, figure, or description thereof for the purpose of streamlining the disclosure and aiding in the understanding of one or more of the embodiments. This method of disclosure, however, is not to be interpreted as requiring more features than are expressly recited in the claims. Indeed, an embodiment may have fewer than all of the features of a single embodiment disclosed above.
Some embodiments use numerals to describe quantities of components, attributes, and the like; it should be understood that such numerals used in the description of the embodiments are in some instances modified by the terms "about," "approximately," or "substantially." Unless otherwise indicated, "about," "approximately," or "substantially" indicates that the stated number allows a variation of ±20%. Accordingly, in some embodiments, the numerical parameters used in the specification and claims are approximations that may vary depending upon the desired properties of the individual embodiment. In some embodiments, the numerical parameters should take into account the specified significant digits and employ ordinary rounding. Notwithstanding that the numerical ranges and parameters setting forth the broad scope of the application are approximations, in the specific examples such numerical values are set forth as precisely as practicable.
Although the present application has been described with reference to specific embodiments, those skilled in the art will recognize that the foregoing embodiments are merely illustrative of the present application and that various changes and equivalent substitutions may be made without departing from the spirit of the application. All changes and modifications to the above-described embodiments that come within the spirit of the application are therefore intended to fall within the scope of the claims of the application.

Claims (13)

1. A method for controlling the driving of an automobile, comprising the steps of:
acquiring driving environment information, wherein the driving environment information comprises traffic participant information, driving track information, cloud platform information and vehicle state information;
constructing a driving environment scene according to the driving environment information, and presenting the driving environment scene in an artificial intelligence display device;
receiving and identifying a control instruction issued according to the driving environment scene; and
controlling the automobile to run according to the control instruction.
2. The control method according to claim 1, wherein the traffic participant information includes pedestrian information and other-vehicle information, the driving track information includes the traffic path and the road surrounding environment throughout the drive, the cloud platform information includes weather information, and the vehicle state information includes basic vehicle parameter information.
3. The control method according to claim 1, wherein constructing a travel environment scene based on the travel environment information includes constructing the travel environment scene in a plurality of layers, each layer containing a part of the travel environment information.
4. The control method according to claim 3, wherein the plurality of layers is five layers, namely a first layer, a second layer, a third layer, a fourth layer and a fifth layer, wherein the first layer contains the information of the driving environment information whose attributes are static, the second layer contains the driving track information, the third layer contains the traffic participant information, the fourth layer contains the vehicle state information, and the fifth layer contains the cloud platform information.
5. The control method according to claim 1, wherein before constructing the driving environment scene according to the driving environment information, the method further comprises fusing the traffic participant information, the driving track information, the cloud platform information and the vehicle state information using Mahalanobis distance theory, the Hungarian matching algorithm and/or Kalman filtering.
6. The control method according to claim 1, wherein the control instruction comprises voice, limb motion and/or brain wave data, and the method comprises recognizing the control instruction using a trained intention recognition model.
7. The control method according to claim 6, further comprising, before acquiring the control instruction issued according to the driving environment scene, performing intention recognition training in a non-driving state, the training comprising training on the voice, limb motion and/or brain wave data based on deep learning to obtain respective training parameters for constructing the intention recognition model.
8. The control method according to claim 7, wherein, when the control instruction is a mixed control instruction of two or more of voice, limb motion and brain wave data, before controlling the automobile to run according to the control instruction, the method further comprises fusing the training parameters according to a proportional-integral-derivative (PID) control and model algorithm control mode.
9. The control method of claim 1, wherein the artificial intelligence display device comprises a virtual reality display device.
10. A control system for vehicle driving, comprising:
a data acquisition module configured to acquire driving environment information, wherein the driving environment information comprises traffic participant information, driving track information, cloud platform information and vehicle state information;
an environment scene construction module configured to construct a driving environment scene according to the driving environment information and present the driving environment scene in an artificial intelligence display device;
an instruction receiving module configured to receive and identify a control instruction issued according to the driving environment scene; and
an instruction control module configured to control the automobile to run according to the control instruction.
11. The control system of claim 10, wherein the environment scene construction module is further configured to construct the driving environment scene in a plurality of layers, each layer containing a portion of the driving environment information.
12. A control system for running of an automobile, comprising:
a memory for storing instructions executable by a processor; and
a processor for executing the instructions to implement the method of any one of claims 1-9.
13. A computer-readable medium having stored thereon computer program code which, when executed by a processor, implements the method of any of claims 1-9.
CN202111092603.5A 2021-09-17 2021-09-17 Control method and control system for automobile running Active CN113619607B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111092603.5A CN113619607B (en) 2021-09-17 2021-09-17 Control method and control system for automobile running

Publications (2)

Publication Number Publication Date
CN113619607A true CN113619607A (en) 2021-11-09
CN113619607B CN113619607B (en) 2023-04-18

Family

ID=78390282

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111092603.5A Active CN113619607B (en) 2021-09-17 2021-09-17 Control method and control system for automobile running

Country Status (1)

Country Link
CN (1) CN113619607B (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106560765A (en) * 2016-06-14 2017-04-12 深圳创达云睿智能科技有限公司 Method and device for content interaction in virtual reality
US20180005526A1 (en) * 2016-06-30 2018-01-04 Honda Research Institute Europe Gmbh Method and system for assisting a driver in driving a vehicle and vehicle on which such system is mounted
CN208938350U (en) * 2018-11-21 2019-06-04 长安大学 A kind of vehicle-mounted HMI display system under adaptive cruise mode
CN110758243A (en) * 2019-10-31 2020-02-07 的卢技术有限公司 Method and system for displaying surrounding environment in vehicle driving process
CN112740134A (en) * 2018-09-21 2021-04-30 三星电子株式会社 Electronic device, vehicle control method of electronic device, server, and method of providing accurate map data of server
CN113085900A (en) * 2021-04-29 2021-07-09 的卢技术有限公司 Method for calling vehicle to travel to user position

Similar Documents

Publication Publication Date Title
DE102018121595B4 (en) UNSUPERVISED TRAINING OF AGENTS FOR AUTONOMOUS DRIVING APPLICATIONS
Wang et al. Visual human–computer interactions for intelligent vehicles and intelligent transportation systems: The state of the art and future directions
Ohn-Bar et al. Looking at humans in the age of self-driving and highly automated vehicles
DE102018127361A1 (en) SYSTEMS AND METHOD FOR AUTONOMOUS MOTOR VEHICLES WITH BEHAVIOR CONTROL
DE102017111843A1 (en) Systems to dynamically guide a user to a pickup location of an autonomous vehicle by means of extended reality walking instructions
DE102019102205A1 (en) SYSTEM AND METHOD FOR THE END TO END VALIDATION OF AUTONOMOUS VEHICLES
DE102021126648A1 (en) IMITATION TRAINING USING SYNTHETIC DATA
CN109300324A (en) A kind of environment information acquisition method and device of pilotless automobile
CN110371132A (en) Driver's adapter tube appraisal procedure and device
Yuan et al. Multi-reward architecture based reinforcement learning for highway driving policies
CN112784867A (en) Training deep neural networks using synthetic images
CN112363511A (en) Vehicle path planning method and device, vehicle-mounted device and storage medium
US11194939B2 (en) Hardware in loop testing and generation of latency profiles for use in simulation
US20220011112A1 (en) Vehicle travel control device
DE102022111322A1 (en) EYE TRACKING ADAPTIVE MACHINE LEARNING MODEL ENGINE
CN113619607B (en) Control method and control system for automobile running
DE102021116309A1 (en) ASSISTANCE FOR DISABLED DRIVERS
DE102022123257A1 (en) Selective elimination of the counteracting robustness features of neural networks
CN114120653A (en) Centralized vehicle group decision control method and device and electronic equipment
CN112668692A (en) Quantifying realism of analog data using GAN
CN108860150A (en) Automobile brake method, apparatus, equipment and computer readable storage medium
Priya et al. Intelligent navigation system for emergency vehicles
CN109532724A (en) A kind of vehicle-mounted automatic Pilot control circuit
US20240161028A1 (en) Orchestrating vehicles and remote operating clients
US20240160203A1 (en) Orchestrating vehicles and remote operating clients

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
CB02 Change of applicant information

Address after: 314500 988 Tong Tong Road, Wu Tong Street, Tongxiang, Jiaxing, Zhejiang

Applicant after: United New Energy Automobile Co., Ltd.

Address before: 314500 988 Tong Tong Road, Wu Tong Street, Tongxiang, Jiaxing, Zhejiang

Applicant before: Hozon New Energy Automobile Co., Ltd.

GR01 Patent grant