CN114546383A - Driving scene display method and device, electronic equipment and storage medium - Google Patents

Driving scene display method and device, electronic equipment and storage medium Download PDF

Info

Publication number
CN114546383A
CN114546383A
Authority
CN
China
Prior art keywords
scene
driving
target
scenario
logic
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210160003.6A
Other languages
Chinese (zh)
Inventor
贾新达
张波
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Jiangsu Zejing Automobile Electronic Co ltd
Original Assignee
Jiangsu Zejing Automobile Electronic Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Jiangsu Zejing Automobile Electronic Co ltd filed Critical Jiangsu Zejing Automobile Electronic Co ltd
Priority to CN202210160003.6A priority Critical patent/CN114546383A/en
Publication of CN114546383A publication Critical patent/CN114546383A/en
Pending legal-status Critical Current

Links

Images

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F8/00Arrangements for software engineering
    • G06F8/30Creation or generation of source code
    • G06F8/38Creation or generation of source code for implementing user interfaces
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F8/00Arrangements for software engineering
    • G06F8/30Creation or generation of source code
    • G06F8/31Programming languages or programming paradigms
    • G06F8/313Logic programming, e.g. PROLOG programming language
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F8/00Arrangements for software engineering
    • G06F8/30Creation or generation of source code
    • G06F8/33Intelligent editors

Landscapes

  • Engineering & Computer Science (AREA)
  • Software Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Computing Systems (AREA)
  • Fittings On The Vehicle Exterior For Carrying Loads, And Devices For Holding Or Mounting Articles (AREA)

Abstract

The embodiment of the application discloses a driving scene display method and device, electronic equipment and a storage medium, relating to the field of computer technology. The method comprises the following steps: judging whether the actual value of a target scene trigger parameter of the vehicle has changed; if the actual value has changed, running a scene logic interpreter according to a scene configuration file and the target scene trigger parameter to obtain an operation result, and determining a target driving scene according to the operation result; and sending scene information of the target driving scene to a scene display component, so that the scene display component displays the scene information of the target driving scene. The technical scheme provided by the embodiment of the application enables flexible modification of scene logic, dynamic configuration and customization of driving scenes, and greatly reduces the time a designer needs to verify a scene.

Description

Driving scene display method and device, electronic equipment and storage medium
Technical Field
The embodiment of the application relates to the technical field of computers, in particular to a driving scene display method and device, electronic equipment and a storage medium.
Background
At present, the driving situations of an automobile are complex and changeable, and in order to display richer driving information on the automobile's User Interface (UI), driving scenes are used to describe the different situations that arise while driving. Conventional automobile UI display software is generally developed in C/C++, and because C/C++ is a static (compiled) language, its logic cannot be modified after compilation. Since driving scenes are a frequently modified service, developing them in the conventional way means that whenever the driving scene logic is changed, the software must be recompiled and re-released; the modification process is complex and requires a person with programming skills.
Disclosure of Invention
The embodiment of the application provides a driving scene display method and device, electronic equipment and a storage medium, which allow scene logic to be modified flexibly, enable dynamic configuration and customization of driving scenes, and greatly reduce the time a designer needs to verify a scene.
In a first aspect, an embodiment of the present application provides a driving scene display method, where the method includes:
judging whether the actual value of the target scene trigger parameter of the vehicle changes or not;
if the actual value changes, running a scene logic interpreter according to a scene configuration file and the target scene trigger parameter to obtain an operation result, and determining a target driving scene according to the operation result;
and sending the scene information of the target driving scene to a scene display component so as to display the scene information of the target driving scene by using the scene display component.
In a second aspect, an embodiment of the present application provides a driving scene display device, where the device includes:
the parameter change judging module is used for judging whether the actual value of the target scene trigger parameter of the vehicle changes or not;
the driving scene determining module is used for, if the actual value changes, running a scene logic interpreter according to a scene configuration file and the target scene trigger parameter to obtain an operation result, and determining a target driving scene according to the operation result;
and the driving scene display module is used for sending the scene information of the target driving scene to a scene display component so as to display the scene information of the target driving scene by using the scene display component.
In a third aspect, an embodiment of the present application provides an electronic device, including:
one or more processors;
storage means for storing one or more programs;
when the one or more programs are executed by the one or more processors, the one or more processors implement the driving scene display method according to any embodiment of the present application.
In a fourth aspect, an embodiment of the present application provides a computer-readable storage medium on which a computer program is stored, where the computer program, when executed by a processor, implements the driving scene display method according to any embodiment of the present application.
The embodiment of the application provides a driving scene display method and device, electronic equipment and a storage medium. The method comprises: judging whether the actual value of a target scene trigger parameter of the vehicle has changed; if the actual value has changed, running a scene logic interpreter according to a scene configuration file and the target scene trigger parameter to obtain an operation result, and determining a target driving scene according to the operation result; and sending scene information of the target driving scene to a scene display component, so that the scene display component displays the scene information. The scene trigger parameters, the reference values and the dynamic-language logic code of the scene logic are written in the scene configuration file, and the dynamic-language logic code is executed by a scene logic interpreter embedded in the static-language program to determine the vehicle's current target driving scene. Separating the scene configuration file from the static-language program solves the traditional problems that modifying scene logic requires software personnel and re-releasing the software: the scene logic can be modified flexibly without modifying the static-language program, and the work of changing a driving scene becomes independent and can be handed to a scene designer, or a user without programming skills, for customization. This realizes dynamic configuration and customization of driving scenes, reduces the probability of introducing bugs, and greatly reduces the time a designer needs to verify a scene.
It should be understood that the statements in this section do not necessarily identify key or critical features of the embodiments of the present application, nor do they limit the scope of the present application. Other features of the present application will become apparent from the following description.
Drawings
The drawings are included to provide a better understanding of the present solution and are not intended to limit the present application. Wherein:
fig. 1 is a first flowchart of a driving scene display method according to an embodiment of the present application;
fig. 2 is a second flowchart of a driving scene display method according to an embodiment of the present application;
fig. 3 is a schematic structural diagram of a driving scene display device according to an embodiment of the present application;
fig. 4 is a block diagram of an electronic device for implementing a driving scene display method according to an embodiment of the present application.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present application clearer, the technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are some embodiments of the present application, but not all embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
Example one
Fig. 1 is a first flowchart of a driving scene display method according to an embodiment of the present application, applicable to determining and displaying the driving scene of a vehicle. The method may be executed by the driving scene display apparatus provided by the embodiment of the present application, and the apparatus may be implemented in software and/or hardware and integrated into an electronic device implementing the method. The electronic device in the embodiment of the present application may be a control device for executing the driving scene display method of the present application; the control device is configured with a static-language program (i.e., the software) and dynamic-language logic code, the static-language program embeds a scene logic interpreter, and the dynamic-language logic code is stored in a scene configuration file.
Referring to fig. 1, the method of the present embodiment includes, but is not limited to, the following steps:
and S110, judging whether the actual value of the target scene trigger parameter of the vehicle changes or not.
The scene trigger parameters are the driving data of the vehicle that trigger scene logic, and their number is related to the number of driving scenes. One scene trigger parameter can correspond to multiple driving scenes; for example, when the scene trigger parameter is the vehicle's speed, the driving scene may be a low-speed, normal-speed or high-speed scene. One driving scene can also correspond to multiple scene trigger parameters: when the driving scene is a high-speed scene, the scene trigger parameters may be speed, wheel rotation speed or fuel consumption per minute. The target scene trigger parameter is one or more of the scene trigger parameters.
In the present embodiment, the actual values of the scene trigger parameters of the vehicle may be stored in the vehicle communication module. While running, the static-language program can read the actual value of the target scene trigger parameter from the vehicle communication module in real time and analyze whether it changes within a preset time period. For example: if the actual value changes within the preset time period, or its rate of change is greater than a preset value, it is determined that the actual value of the target scene trigger parameter has changed.
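The change detection described above can be sketched as follows. This is a minimal illustration in Python (the patent's static-language program is C/C++); the class name, the callable parameter source and the `min_delta` threshold are assumptions standing in for the vehicle communication module and the preset value.

```python
class TriggerParamWatcher:
    """Polls a parameter source and reports whether the actual value
    of a scene trigger parameter has changed (hypothetical sketch)."""

    def __init__(self, read_value, min_delta=0.0):
        self.read_value = read_value   # callable returning the current actual value
        self.min_delta = min_delta     # change threshold below which updates are ignored
        self.last_value = None

    def has_changed(self):
        value = self.read_value()
        changed = (self.last_value is not None
                   and abs(value - self.last_value) > self.min_delta)
        self.last_value = value
        return changed

# usage: simulate the vehicle communication module with a sequence of readings
readings = iter([50.0, 50.0, 61.0])
watcher = TriggerParamWatcher(lambda: next(readings), min_delta=0.5)
print(watcher.has_changed())  # False (first reading, nothing to compare against)
print(watcher.has_changed())  # False (50.0 -> 50.0)
print(watcher.has_changed())  # True  (50.0 -> 61.0)
```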
And S120, if the actual value changes, running the scene logic interpreter according to the scene configuration file and the target scene trigger parameter to obtain an operation result, and determining the target driving scene according to the operation result.
In the embodiment of the application, if it is determined that the actual value of the target scene trigger parameter changes, the static language program calls the scene logic interpreter to execute the scene logic according to the scene configuration file and the target scene trigger parameter, so as to obtain the operation result of the scene logic interpreter, and determine the current target driving scene of the vehicle according to the operation result.
In one embodiment, the scene configuration file includes a scene trigger parameter configured for each driving scene, a dynamic language logic code (i.e., scene logic) configured for each driving scene, and a reference value for the scene trigger parameter.
Further, running the scene logic interpreter according to the scene configuration file and the target scene trigger parameter to obtain an operation result comprises: acquiring the reference value and the dynamic-language logic code corresponding to the target scene trigger parameter from the scene configuration file; calling the scene logic interpreter to run the dynamic-language logic code on the actual value and the reference value, so as to determine the difference between them; and determining the operation result from that difference. The difference between the actual value and the reference value may be a magnitude relationship, or whether the gap between the two falls within a preset range.
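A single interpreter run of this first embodiment can be sketched as follows; the config-entry field names and the two `relation` kinds (magnitude relationship, preset range) are assumptions used purely for illustration.

```python
def run_scene_logic(config_entry, actual_value):
    """Hypothetical sketch of one interpreter run: compare the actual value of
    the target scene trigger parameter against the configured reference value
    and return the operation result (True = scene entered)."""
    reference = config_entry["reference_value"]
    relation = config_entry["relation"]          # how the difference is judged
    if relation == "greater":                    # magnitude relationship
        return actual_value > reference
    if relation == "within":                     # gap within a preset range
        return abs(actual_value - reference) <= config_entry["range"]
    raise ValueError(f"unknown relation: {relation}")

overspeed = {"reference_value": 60, "relation": "greater"}
print(run_scene_logic(overspeed, 55))  # False: 55 > 60 does not hold
print(run_scene_logic(overspeed, 61))  # True: 61 > 60, overspeed scene entered
```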
In another embodiment, the scene configuration file comprises a scene trigger parameter configured for each driving scene, a dynamic language logic code configured for each driving scene, and an association parameter having an association relationship with the scene trigger parameter.
Further, operating the scene logic interpreter according to the scene configuration file and the target scene trigger parameter to obtain an operation result, including: acquiring associated parameters and dynamic language logic codes corresponding to target scene trigger parameters from a scene configuration file; acquiring an actual value of the associated parameter; calling a scene logic interpreter to run a dynamic language logic code according to the actual value of the target scene trigger parameter and the actual value of the associated parameter so as to determine the difference between the actual value of the target scene trigger parameter and the actual value of the associated parameter; and determining the operation result according to the difference. Wherein the number of associated parameters may be one or more.
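The second embodiment, which compares the trigger parameter against associated parameters rather than a fixed reference value, can be sketched similarly. The function, the `max_gap` threshold and the example pairing of displayed speed with wheel-derived speed are all assumptions for illustration.

```python
def run_associated_logic(actual_value, associated_values, max_gap):
    """Hypothetical sketch of the second embodiment: the operation result is
    derived from the difference between the trigger parameter's actual value
    and the actual values of one or more associated parameters."""
    # result holds only if every associated parameter stays within max_gap
    return all(abs(actual_value - v) <= max_gap for v in associated_values)

# assumed example: displayed speed vs. speeds derived from wheel rotation
print(run_associated_logic(61.0, [60.2, 61.5], max_gap=2.0))  # True
print(run_associated_logic(61.0, [50.0], max_gap=2.0))        # False
```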
It should be noted that the two embodiments listed above are merely illustrative of step S120 (i.e., the scene logic interpreter is operated according to the scene configuration file and the target scene trigger parameter to obtain the operation result), and other embodiments may be included besides the two embodiments, and the other embodiments also belong to the scope of protection of the present application.
The reason for configuring the scene trigger parameters and the scene logic is that the vehicle produces many types of driving data; to improve the recognition performance of driving scenes, each driving scene must indicate which data changes trigger its scene logic, so the scene trigger parameters and scene logic of each driving scene must be configured. For example, when the driving scene is an overspeed scene, the scene trigger parameter is 'speed', the reference value of the scene trigger parameter is the speed limit value 'speed_limit', and the scene logic is 'speed' > 'speed_limit'. In the overspeed scene, a change of speed triggers the scene logic, whether the vehicle enters the overspeed scene is judged according to the scene logic expression, and changes of all other data are ignored. An example of the configuration is as follows:
(The configuration example appears as an image in the original publication.)
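Since the original configuration example survives only as an image, a hypothetical equivalent is sketched below as a Python dict. The field names and the scene name `over_speed_limit` are assumptions consistent with the overspeed example in the text; the patent itself stores this content in a scene configuration file read by the static-language program.

```python
# Hypothetical scene configuration mirroring the overspeed example above.
SCENE_CONFIG = {
    "over_speed_limit": {                       # scene name added to the scene list
        "trigger_params": ["speed"],            # data whose change triggers the logic
        "reference_values": {"speed_limit": 60},
        "logic": "'speed' > 'speed_limit'",     # dynamic-language logic expression
    },
}

entry = SCENE_CONFIG["over_speed_limit"]
print(entry["trigger_params"])  # ['speed']
print(entry["logic"])           # 'speed' > 'speed_limit'
```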
When the static-language program starts, it reads the scene configuration file, stores the configured scene logic under the corresponding driving scene, and then subscribes to data for the driving scene according to the configured scene trigger parameters. After subscription, when the actual value of a scene trigger parameter (e.g., 'speed') changes, the static-language program calls the scene logic interpreter to read that parameter's reference value (e.g., 'speed_limit') and scene logic (e.g., 'speed' > 'speed_limit') from the scene configuration file. In addition, when the scene configuration file is read, each scene name (e.g., 'over_speed_limit') is added to the software scene list. The scene logic of the driving scene is then interpreted by the scene logic interpreter to obtain an operation result: when the result is true, the vehicle enters the driving scene; when the result is false, it does not.
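The start-up steps above (store the logic, build subscriptions, populate the scene list) can be sketched as follows; the data structures are assumptions, chosen only to make the flow concrete.

```python
def load_scene_config(config):
    """Sketch of program start-up: record each configured scene, build a
    parameter -> subscribed-scenes map, and collect scene names for the
    software scene list (structure assumed, not taken from the patent)."""
    subscriptions = {}   # trigger parameter -> scene names subscribed to it
    scene_list = {}      # scene name -> current value (entered or not)
    for scene_name, entry in config.items():
        scene_list[scene_name] = False
        for param in entry["trigger_params"]:
            subscriptions.setdefault(param, []).append(scene_name)
    return subscriptions, scene_list

config = {
    "over_speed_limit": {"trigger_params": ["speed"]},
    "low_speed": {"trigger_params": ["speed"]},
}
subs, scenes = load_scene_config(config)
print(subs["speed"])   # ['over_speed_limit', 'low_speed']
print(scenes)          # {'over_speed_limit': False, 'low_speed': False}
```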
In the prior art, writing scene trigger parameters, reference values and scene logics through a static language causes inconvenience in changing or modifying a driving scene, a flow is complex, and a person with programming capability is required to change or modify the driving scene. In order to improve the defect, the dynamic language logic codes of the scene trigger parameters, the reference values and the scene logic are written in the scene configuration file, so that not only can the driving scene be quickly modified, but also the changing or modifying work of the driving scene can be independent and delivered to a scene designer or a user without programming capability for self-definition, and the dynamic configuration and the self-definition of the driving scene can be realized.
Illustratively, if a driving scene is changed later, the static-language program does not need to be modified; only the scene logic in the scene configuration file does. For example, if the overspeed scene should trigger only when the speed exceeds the reference value by more than 10%, the original scene logic is modified to 'speed' > 'speed_limit' * 1.1, and whether the vehicle enters the overspeed scene is determined according to the new scene logic. Likewise, as long as the rules are understood, a scene designer without programming skills can modify the scene logic by editing the logic statement directly, without involving software personnel, which improves design-verification efficiency.
S130, sending the scene information of the target driving scene to a scene display component so as to display the scene information of the target driving scene by using the scene display component.
In the embodiment of the application, the scene logic interpreter runs the dynamic-language logic code corresponding to the target scene trigger parameter. When the operation result is true, the vehicle enters the target driving scene and the displayed driving scene needs to be updated, so the scene information of the target driving scene is sent to the scene display component, which adjusts the display according to that information. For example, after the overspeed scene is triggered, the value of its scene name 'over_speed_limit' is set to true, indicating that the overspeed scene has been entered; the scene display component detects the values of all scene names, and when it finds that 'over_speed_limit' has become true, it judges that the overspeed scene has been entered and modifies the display color of the speedometer to remind the user of overspeed. The scene display component may be the UI configured on the vehicle-mounted terminal; the display mode of the scene information of the target driving scene is not particularly limited, and may be changing the display color of a meter (such as the speedometer) or playing a scene animation.
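The display component's reaction to scene-name values can be sketched as follows; the class, the color values and the scene name `over_speed_limit` are illustrative assumptions.

```python
class SceneDisplayComponent:
    """Sketch of a display component that watches scene-name values and
    adjusts the UI when a scene becomes active (colors/behavior assumed)."""

    def __init__(self):
        self.speedometer_color = "white"

    def on_scene_update(self, scene_values):
        # detect the values of all scene names; react to the ones that are true
        if scene_values.get("over_speed_limit"):
            self.speedometer_color = "red"   # remind the user of overspeed
        else:
            self.speedometer_color = "white"

display = SceneDisplayComponent()
display.on_scene_update({"over_speed_limit": True})
print(display.speedometer_color)  # red
display.on_scene_update({"over_speed_limit": False})
print(display.speedometer_color)  # white
```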
According to the technical scheme provided by this embodiment, whether the actual value of the target scene trigger parameter of the vehicle has changed is judged; if it has changed, the scene logic interpreter is run according to the scene configuration file and the target scene trigger parameter to obtain an operation result, and the target driving scene is determined from that result; and the scene information of the target driving scene is sent to the scene display component for display. The scene trigger parameters, the reference values and the dynamic-language logic code of the scene logic are written in the scene configuration file, and the code is executed by the scene logic interpreter embedded in the static-language program to determine the vehicle's current target driving scene. Separating the scene configuration file from the static-language program solves the traditional problems that modifying scene logic requires software personnel and re-releasing the software: scene logic can be modified flexibly without modifying the static-language program, the work of changing a driving scene becomes independent and can be handed to a scene designer, or a user without programming skills, for customization, dynamic configuration and customization of driving scenes are realized, the probability of introducing bugs is reduced, and the time a designer needs to verify a scene is greatly reduced.
Example two
Fig. 2 is a second flowchart of the driving scene display method according to the embodiment of the present application. This embodiment is optimized on the basis of the first embodiment; specifically, it explains the determination process of the scene trigger parameters in detail.
Referring to fig. 2, the method of the present embodiment includes, but is not limited to, the following steps:
s210, obtaining driving data of the vehicle, searching whether the driving data exist in the scene configuration file, and if the driving data exist in the scene configuration file, determining the driving data as a scene trigger parameter.
In the embodiment of the present application, the vehicle produces many types of driving data; some of it can trigger scene logic and some cannot. The scene configuration file contains the scene trigger parameters configured for each driving scene. Therefore, while running, the static-language program can read the driving data of the vehicle in real time, look up whether each piece of driving data exists in the scene configuration file, and, if it does, determine that driving data as a scene trigger parameter.
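Step S210 amounts to filtering the incoming driving data against the configured trigger parameters; a minimal sketch, with an assumed config shape and an assumed non-triggering datum (wiper state):

```python
def pick_trigger_params(driving_data, config):
    """Sketch of S210: keep only the driving data whose names appear as
    configured scene trigger parameters in the scene configuration file."""
    configured = {p for entry in config.values() for p in entry["trigger_params"]}
    return {name: v for name, v in driving_data.items() if name in configured}

config = {"over_speed_limit": {"trigger_params": ["speed"]}}
data = {"speed": 61, "wiper_state": "on"}   # wiper state triggers no scene logic
print(pick_trigger_params(data, config))    # {'speed': 61}
```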
And S220, judging whether the actual value of the target scene trigger parameter of the vehicle changes or not.
In the present embodiment, the actual values of the scene trigger parameters of the vehicle may be stored in the data storage module. While running, the static-language program can read the actual value of the target scene trigger parameter from the data storage module in real time and analyze whether it changes within a preset time period. For example: if the actual value changes within the preset time period, or its rate of change is greater than a preset value, it is determined that the actual value of the target scene trigger parameter has changed.
And S230, if the actual value changes, looking up the subscribed driving scene list corresponding to the target scene trigger parameter from the scene configuration file, and running the scene logic interpreter according to the scene configuration file and the target scene trigger parameter to obtain an operation result.
In the embodiment of the present application, one scene trigger parameter may correspond to a plurality of driving scenes, for example, when the scene trigger parameter is the speed of the vehicle, the driving scene may be a low speed scene, a normal speed scene, or a high speed scene. The subscribed driving scene list corresponding to each scene trigger parameter may be configured in the scene configuration file, for example, when the scene trigger parameter is the speed of the vehicle, the subscribed driving scene list includes a low-speed scene, a normal-speed scene, or a high-speed scene.
In the embodiment of the application, when the actual value of the target scene trigger parameter changes, the static language program calls the scene logic interpreter, and the scene logic of each driving scene subscribed to the driving scene list is operated one by one according to the scene configuration file and the target scene trigger parameter to obtain the operation result of each driving scene.
Preferably, the scene configuration file comprises the dynamic-language logic code configured for each driving scene; the code is written according to preset grammar rules and can be modified. The grammar rules cover logical operation expressions, data names and data values.
The scene logic interpreter interprets and executes the dynamic-language logic code in the scene configuration file. Its operation mainly comprises two parts: lexical analysis and interpretation/execution. Lexical analysis divides the written scene-logic character string into words. For example, the overspeed scene logic is 'speed' > 'speed_limit'; lexical analysis divides the statement into three words, namely 'speed', '>' and 'speed_limit', after which interpretation and execution proceed. According to the defined grammar, 'speed' and 'speed_limit' are taken as names. Names refer to predefined data; for example, it is agreed that the current speed value is stored under the name 'speed'. When the scene logic interpreter executes, each name is replaced by the actual value of the corresponding data: if 'speed' is 55 and 'speed_limit' is 60, the statement finally becomes 55 > 60, the execution result is false, and the overspeed scene is judged to be false, i.e., the scene has not changed. The interpreter has then completed interpreting the overspeed scene logic.
Writing the dynamic-language logic code and the scene logic interpreter is a key part of this scheme. The grammar rules of the dynamic-language logic code should be simple, easy to understand and quick to learn, and only data operations and logical operations are needed here. Accordingly, the grammar may include statements (Statement), expressions (Expression) and the basic units of operations (Primary, composed through Term and SubTerm). A Primary may be a data name or a data value; a Statement is a logical operation between Expressions; an Expression may be a size or equality comparison between Terms, or a parenthesized expression; Term and SubTerm are calculations between Primaries.
Illustratively, based on the above grammar rules, statements may take forms such as "'speed' > 0 AND ('speed' > 'speed_limit')" and "'distance' < 100 + 1".
The scene logic interpreter is written as follows. Using C/C++ and following the above grammar rules, the interpreter first reads the input statement character string and divides it into tokens. Each token has a corresponding type: Number for numeric literals, Name for identifiers, '+', '-', '*', '/' and '%' for arithmetic operators, '==', '!=', '>', '<', '>=' and '<=' for comparison operators, and AND/OR for logical operators. This process is lexical analysis. After lexical analysis, the character string has become a string of tokens; the tokens are then read one by one according to the grammar and analyzed for conformance, which is syntactic analysis. After syntactic analysis comes interpretation and execution: a Name corresponds to the ID of a piece of internal data, the data is fetched by that ID and compared with the other numbers, the logical operations are completed after the comparisons, and the result is output to the business-logic layer. The whole scene-logic process then ends, and it is executed again the next time the scene logic is triggered.
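Although the patent writes its interpreter in C/C++, the same two-stage structure (lexical analysis, then recursive-descent interpretation following the Statement/Expression/Term/SubTerm/Primary grammar) can be sketched compactly in Python. This is illustrative only: token and rule names follow the text, everything else is an assumption.

```python
import re

# token pattern: numbers, quoted names, AND/OR, comparison/arithmetic operators
TOKEN_RE = re.compile(
    r"\s*(?:(\d+(?:\.\d+)?)|'([A-Za-z_]\w*)'|(AND|OR)|(>=|<=|==|!=|[><()+\-*/%]))")

def tokenize(src):
    """Lexical analysis: split the scene-logic string into typed tokens."""
    tokens, pos, src = [], 0, src.strip()
    while pos < len(src):
        m = TOKEN_RE.match(src, pos)
        if not m:
            raise SyntaxError(f"bad input at {src[pos:]!r}")
        num, name, logic, op = m.groups()
        if num is not None:
            tokens.append(("NUM", float(num)))
        elif name is not None:
            tokens.append(("NAME", name))
        elif logic is not None:
            tokens.append(("LOGIC", logic))
        else:
            tokens.append(("OP", op))
        pos = m.end()
    return tokens

class SceneLogicInterpreter:
    """Recursive-descent interpreter for the Statement/Expression/Term/
    SubTerm/Primary grammar sketched in the text."""

    def __init__(self, data):
        self.data = data                     # name -> actual value

    def eval(self, src):
        self.toks, self.i = tokenize(src), 0
        return self.statement()

    def peek(self):
        return self.toks[self.i] if self.i < len(self.toks) else (None, None)

    def statement(self):                     # logical operations between Expressions
        value = self.expression()
        while self.peek()[0] == "LOGIC":
            op = self.toks[self.i][1]; self.i += 1
            rhs = self.expression()
            value = (value and rhs) if op == "AND" else (value or rhs)
        return value

    def expression(self):                    # comparison between Terms
        left = self.term()
        kind, op = self.peek()
        if kind == "OP" and op in (">", "<", ">=", "<=", "==", "!="):
            self.i += 1
            right = self.term()
            return {">": left > right, "<": left < right,
                    ">=": left >= right, "<=": left <= right,
                    "==": left == right, "!=": left != right}[op]
        return left

    def term(self):                          # + and - between SubTerms
        value = self.subterm()
        while self.peek() in (("OP", "+"), ("OP", "-")):
            op = self.toks[self.i][1]; self.i += 1
            value = value + self.subterm() if op == "+" else value - self.subterm()
        return value

    def subterm(self):                       # *, / and % between Primaries
        value = self.primary()
        while self.peek() in (("OP", "*"), ("OP", "/"), ("OP", "%")):
            op = self.toks[self.i][1]; self.i += 1
            nxt = self.primary()
            value = value * nxt if op == "*" else value / nxt if op == "/" else value % nxt
        return value

    def primary(self):                       # data value, data name or parentheses
        kind, val = self.toks[self.i]; self.i += 1
        if kind == "NUM":
            return val
        if kind == "NAME":
            return self.data[val]            # replace the name by its actual value
        if (kind, val) == ("OP", "("):
            inner = self.statement()
            self.i += 1                      # consume the closing ')'
            return inner
        raise SyntaxError(f"unexpected token {val!r}")

it = SceneLogicInterpreter({"speed": 55, "speed_limit": 60, "distance": 100})
print(it.eval("'speed' > 'speed_limit'"))                    # False (55 > 60)
print(it.eval("'speed' > 0 AND ('speed' > 'speed_limit')"))  # False
print(it.eval("'distance' < 100 + 1"))                       # True (100 < 101)
```

The design choice mirrors the text: lexical analysis runs to completion before syntactic analysis, and names are resolved to actual values only during interpretation, which is what lets the logic live in a configuration file instead of the compiled program.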
And S240, determining a target driving scene from the subscribed driving scene list according to the operation result.
For example, when the running result of the overspeed scene is true, the vehicle enters the overspeed scene, and when the running result of the low-speed scene is true, the vehicle enters the low-speed scene.
Optionally, the static language program may read the driving data of the vehicle in real time while the vehicle is running, parse it, and store it under different names; for example, the speed data is parsed after being read and stored as the value of the scene trigger parameter 'speed'. Scene logic may be triggered whenever a scene trigger parameter changes, and the static language program then uses the scene logic interpreter to execute the triggered scene logic. For example, if the speed value read at this moment is 61 and differs from the value read last time, 61 is stored in 'speed' and the list of driving scenes subscribed to 'speed' is consulted; an overspeed scene is found, so the scene logic interpreter is called to execute the scene logic of the overspeed scene. If the reference value is 60 at this moment, executing the overspeed scene logic 'speed > speed_limit' returns true, indicating that the vehicle has entered the overspeed scene.
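The read–store–check-subscriptions–dispatch flow above can be sketched as follows. This is a self-contained illustration, not the patented implementation: the class and method names are assumptions, and each scene's logic is stubbed with a callable standing in for the scene logic interpreter.

```python
class SceneEngine:
    """Sketch of the static program's trigger-and-dispatch loop."""

    def __init__(self, config):
        # config maps trigger-parameter name -> {"scenes": {name: logic_fn}},
        # i.e. the subscription lists derived from the scene configuration file.
        self.config = config
        self.values = {}  # current actual values of the trigger parameters

    def on_driving_data(self, name, value):
        """Called for each piece of driving data read; returns scenes entered."""
        if name not in self.config:        # not a scene trigger parameter
            return []
        if self.values.get(name) == value: # value unchanged: nothing triggers
            return []
        self.values[name] = value          # store under the parameter name
        entered = []
        # Consult the driving scenes subscribed to this parameter and run
        # each scene's logic; a true result means the scene is entered.
        for scene, logic in self.config[name]["scenes"].items():
            if logic(self.values):
                entered.append(scene)
        return entered

# Overspeed example from the text: reference value 60, a speed of 61 arrives.
config = {
    "speed": {"scenes": {
        "overspeed": lambda v: v["speed"] > v.get("speed_limit", 60),
        "low_speed": lambda v: v["speed"] < 20,
    }},
}
engine = SceneEngine(config)
```

Note the early return when the value is unchanged: it implements step S210's "judge whether the actual value of the target scene trigger parameter changes" before any scene logic runs.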
And S250, sending the scene information of the target driving scene to a scene display component so as to display the scene information of the target driving scene by using the scene display component.
According to the technical scheme provided by this embodiment, the driving data of the vehicle is acquired and the scene configuration file is searched for the driving data; if present, the driving data is determined to be a scene trigger parameter. It is then judged whether the actual value of the target scene trigger parameter of the vehicle has changed; if it has, the subscribed driving scene list corresponding to the target scene trigger parameter is looked up in the scene configuration file, and the scene logic interpreter is run according to the scene configuration file and the target scene trigger parameter to obtain an operation result. A target driving scene is determined from the subscribed driving scene list according to the operation result, and the scene information of the target driving scene is sent to the scene display component, which displays it. The scene trigger parameters, reference values and dynamic language logic code of the scene logic are written in the scene configuration file, and the dynamic language logic code is executed by the scene logic interpreter inside the static language program to determine the current target driving scene of the vehicle.
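The patent does not fix a concrete encoding for the scene configuration file; the sketch below shows one plausible JSON layout. All field names (`trigger_params`, `logic`, `reference_values`, etc.) are assumptions for illustration. It also shows how the per-parameter subscribed-scene lists can be derived from such a file.

```python
import json

# One plausible (assumed) encoding of the scene configuration file: each
# driving scene carries its trigger parameter(s), its dynamic language logic
# code, and any reference values such as 'speed_limit'.
scene_config = {
    "scenes": [
        {
            "name": "overspeed",
            "trigger_params": ["speed"],
            "logic": "speed > speed_limit",
            "reference_values": {"speed_limit": 60},
        },
        {
            "name": "low_speed",
            "trigger_params": ["speed"],
            "logic": "speed < 20",
            "reference_values": {},
        },
    ]
}

def subscriptions(config):
    """Build the trigger-parameter -> subscribed-driving-scene-list mapping."""
    subs = {}
    for scene in config["scenes"]:
        for param in scene["trigger_params"]:
            subs.setdefault(param, []).append(scene["name"])
    return subs

# What would be written to disk; editing this file changes the scene logic
# without touching or re-releasing the static language program.
text = json.dumps(scene_config, indent=2)
```

Because the logic lives in the configuration file as text, a scene designer can change `"speed > speed_limit"` to, say, `"speed > speed_limit + 5"` without recompiling anything, which is the separation the scheme aims for.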
The method and the device solve the problems of the traditional approach, in which modifying scene logic requires software engineers and a re-release of the software. Because the scene configuration file is separated from the static language program, the scene logic can be modified flexibly without changing the static language program. At the same time, changing or modifying driving scenes becomes an independent task that can be handed to a scene designer, or even a user without programming skills, to define. This enables dynamic configuration and customization of driving scenes, reduces the probability of bugs, and greatly shortens the time designers need to verify a scene.
Example three
Fig. 3 is a schematic structural diagram of a driving scene display apparatus according to an embodiment of the present application, and as shown in fig. 3, the apparatus 300 may include:
a parameter change determining module 310, configured to determine whether an actual value of a target scene trigger parameter of the vehicle changes;
a driving scenario determining module 320, configured to, if the actual value changes, run the scenario logic interpreter according to the scenario configuration file and the target scenario trigger parameter to obtain an operation result, and determine a target driving scenario according to the operation result;
the driving scene display module 330 is configured to send the scene information of the target driving scene to a scene display component, so as to display the scene information of the target driving scene by using the scene display component.
Further, the driving scene display device may further include: a parameter determination module;
the parameter determining module is used for acquiring driving data of the vehicle and searching whether the driving data exists in the scene configuration file before judging whether the actual value of the target scene trigger parameter of the vehicle changes; and if so, determining the driving data as a scene trigger parameter.
Optionally, the scene configuration file includes a scene trigger parameter configured for each driving scene, a dynamic language logic code configured for each driving scene, and a reference value of the scene trigger parameter.
Further, the driving scenario determination module 320 may be specifically configured to: acquiring a reference value corresponding to the target scene trigger parameter and the dynamic language logic code from the scene configuration file; calling the scene logic interpreter to run the dynamic language logic code according to the actual value and the reference value so as to determine the difference between the actual value and the reference value; and determining the operation result according to the difference.
Optionally, the scene configuration file includes a scene trigger parameter configured for each driving scene, a dynamic language logic code configured for each driving scene, and an association parameter having an association relationship with the scene trigger parameter.
Further, the driving scenario determination module 320 may be further specifically configured to: acquiring the associated parameters and the dynamic language logic codes corresponding to the target scene trigger parameters from the scene configuration file; acquiring an actual value of the correlation parameter; calling the scene logic interpreter to run the dynamic language logic code according to the actual value of the target scene trigger parameter and the actual value of the associated parameter so as to determine the difference between the actual value of the target scene trigger parameter and the actual value of the associated parameter; and determining the operation result according to the difference.
Optionally, the dynamic language logic code is written based on a preset grammar rule, and the dynamic language logic code has a modification function; the grammar rules include logical operational expressions, data names, and data values.
Further, the driving scenario determination module 320 may be further specifically configured to: before the scene logic interpreter is operated according to the scene configuration file and the target scene trigger parameters to obtain an operation result, a subscription driving scene list corresponding to the target scene trigger parameters is consulted from the scene configuration file; correspondingly, the target driving scene is determined from the subscription driving scene list according to the operation result.
The driving scene display device provided by the embodiment can be applied to the driving scene display method provided by any embodiment, and has corresponding functions and beneficial effects.
Example four
Fig. 4 is a block diagram of an electronic device for implementing a driving scenario presentation method according to an embodiment of the present application, and fig. 4 shows a block diagram of an exemplary electronic device suitable for implementing an embodiment of the present application. The electronic device shown in fig. 4 is only an example, and should not bring any limitation to the functions and applicable scope of the embodiments of the present application. The electronic device can be a smart phone, a tablet computer, a notebook computer, a vehicle-mounted terminal, a wearable device and the like.
As shown in fig. 4, electronic device 400 is embodied in the form of a general purpose computing device. The components of electronic device 400 may include, but are not limited to: one or more processors or processing units 416, a memory 428, and a bus 418 that couples the various system components including the memory 428 and the processing unit 416.
Bus 418 represents one or more of any of several types of bus structures, including a memory bus or memory controller, a peripheral bus, an accelerated graphics port, and a processor or local bus using any of a variety of bus architectures. By way of example, such architectures include, but are not limited to, an Industry Standard Architecture (ISA) bus, a Micro Channel Architecture (MCA) bus, an Enhanced ISA (EISA) bus, a Video Electronics Standards Association (VESA) local bus, and a Peripheral Component Interconnect (PCI) bus.
Electronic device 400 typically includes a variety of computer system readable media. Such media may be any available media that is accessible by electronic device 400 and includes both volatile and nonvolatile media, removable and non-removable media.
Memory 428 can include computer system readable media in the form of volatile memory, such as Random Access Memory (RAM) 430 and/or cache memory 432. The electronic device 400 may further include other removable/non-removable, volatile/nonvolatile computer system storage media. By way of example only, storage system 434 may be used to read from and write to non-removable, nonvolatile magnetic media (not shown in FIG. 4, commonly referred to as a "hard drive"). Although not shown in FIG. 4, a magnetic disk drive for reading from and writing to a removable, nonvolatile magnetic disk (e.g., a "floppy disk") and an optical disk drive for reading from or writing to a removable, nonvolatile optical disk (e.g., a CD-ROM, DVD-ROM, or other optical media) may be provided. In these cases, each drive may be connected to bus 418 by one or more data media interfaces. Memory 428 may include at least one program product having a set (e.g., at least one) of program modules that are configured to carry out the functions of embodiments of the application.
A program/utility 440 having a set (at least one) of program modules 442 may be stored, for instance, in memory 428; such program modules 442 include, but are not limited to, an operating system, one or more application programs, other program modules, and program data, each of which, or some combination thereof, may include an implementation of a network environment. The program modules 442 generally perform the functions and/or methods described in the embodiments herein.
The electronic device 400 may also communicate with one or more external devices 414 (e.g., keyboard, pointing device, display 424, etc.), with one or more devices that enable a user to interact with the electronic device 400, and/or with any devices (e.g., network card, modem, etc.) that enable the electronic device 400 to communicate with one or more other computing devices. Such communication may occur via input/output (I/O) interfaces 422. Also, electronic device 400 may communicate with one or more networks (e.g., a Local Area Network (LAN), a Wide Area Network (WAN), and/or a public network such as the Internet) through network adapter 420. As shown in FIG. 4, network adapter 420 communicates with the other modules of electronic device 400 over bus 418. It should be appreciated that although not shown in FIG. 4, other hardware and/or software modules may be used in conjunction with electronic device 400, including but not limited to: microcode, device drivers, redundant processing units, external disk drive arrays, RAID systems, tape drives, and data backup storage systems, to name a few.
The processing unit 416 executes programs stored in the memory 428 to perform various functional applications and data processing, such as implementing a driving scene display method provided in any embodiment of the present application.
Example five
The embodiment of the present application further provides a computer-readable storage medium, on which a computer program (or referred to as computer-executable instructions) is stored, where the program, when executed by a processor, can be used to execute the driving scenario displaying method provided in any of the above embodiments of the present application.
The computer storage media of the embodiments of the present application may take any combination of one or more computer-readable media. The computer readable medium may be a computer readable signal medium or a computer readable storage medium. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples (a non-exhaustive list) of the computer readable storage medium would include the following: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the context of this document, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
A computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
Computer program code for carrying out operations of embodiments of the present application may be written in any combination of one or more programming languages, including an object-oriented programming language such as Java or C++, or a conventional procedural programming language such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server. In the case of a remote computer, the remote computer may be connected to the user's computer through any type of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet service provider).

Claims (11)

1. A driving scene display method is applied to a vehicle, wherein the vehicle is provided with a scene configuration file and a scene logic interpreter, and the method comprises the following steps:
judging whether the actual value of the target scene trigger parameter of the vehicle changes or not;
if the actual value changes, operating the scene logic interpreter according to the scene configuration file and the target scene trigger parameter to obtain an operation result, and determining a target driving scene according to the operation result;
and sending the scene information of the target driving scene to a scene display component so as to display the scene information of the target driving scene by using the scene display component.
2. The driving scenario presentation method of claim 1, further comprising, before determining whether the actual value of the target scenario trigger parameter of the vehicle changes:
acquiring driving data of the vehicle, and searching whether the driving data exists in the scene configuration file;
and if so, determining the driving data as a scene trigger parameter.
3. The driving scenario presentation method according to claim 1, wherein the scenario configuration file includes a scenario trigger parameter configured for each driving scenario, a dynamic language logic code configured for each driving scenario, and a reference value of the scenario trigger parameter.
4. The driving scenario presentation method of claim 3, wherein the operating the scenario logic interpreter according to the scenario configuration file and the target scenario trigger parameter to obtain an operating result comprises:
acquiring a reference value corresponding to the target scene trigger parameter and the dynamic language logic code from the scene configuration file;
calling the scene logic interpreter to run the dynamic language logic code according to the actual value and the reference value so as to determine the difference between the actual value and the reference value;
and determining the operation result according to the difference.
5. The driving scenario presentation method according to claim 1, wherein the scenario configuration file includes a scenario trigger parameter configured for each driving scenario, a dynamic language logic code configured for each driving scenario, and an association parameter having an association relationship with the scenario trigger parameter.
6. The driving scenario presentation method of claim 5, wherein the operating the scenario logic interpreter according to the scenario configuration file and the target scenario trigger parameter to obtain an operating result comprises:
acquiring the associated parameters and the dynamic language logic codes corresponding to the target scene trigger parameters from the scene configuration file;
acquiring an actual value of the associated parameter;
calling the scene logic interpreter to run the dynamic language logic code according to the actual value of the target scene trigger parameter and the actual value of the associated parameter so as to determine the difference between the actual value of the target scene trigger parameter and the actual value of the associated parameter;
and determining the operation result according to the difference.
7. The driving scenario presentation method according to claim 3 or 5, wherein the dynamic language logic code is written based on a preset grammar rule, the dynamic language logic code having a modification function; the grammar rules include logical operational expressions, data names, and data values.
8. The driving scenario presentation method of claim 1, wherein before the executing the scenario logic interpreter according to the scenario configuration file and the target scenario trigger parameter to obtain an execution result, the method further comprises:
looking up a subscribed driving scene list corresponding to the target scene trigger parameter from the scene configuration file;
correspondingly, the determining a target driving scene according to the operation result includes:
and determining the target driving scene from the subscribed driving scene list according to the operation result.
9. A driving scenario presentation apparatus integrated into a vehicle, the vehicle having a scenario profile and a scenario logic interpreter, the apparatus comprising:
the parameter change judging module is used for judging whether the actual value of the target scene trigger parameter of the vehicle changes or not;
the driving scene determining module is used for, if the actual value changes, operating the scene logic interpreter according to the scene configuration file and the target scene trigger parameter to obtain an operation result, and determining a target driving scene according to the operation result;
and the driving scene display module is used for sending the scene information of the target driving scene to a scene display component so as to display the scene information of the target driving scene by using the scene display component.
10. An electronic device, characterized in that the electronic device comprises:
one or more processors;
storage means for storing one or more programs;
when the one or more programs are executed by the one or more processors, the one or more processors are caused to implement the driving scenario presentation method of any one of claims 1-8.
11. A computer-readable storage medium, on which a computer program is stored, which, when being executed by a processor, implements the driving scenario presentation method of any one of claims 1 to 8.
CN202210160003.6A 2022-02-22 2022-02-22 Driving scene display method and device, electronic equipment and storage medium Pending CN114546383A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210160003.6A CN114546383A (en) 2022-02-22 2022-02-22 Driving scene display method and device, electronic equipment and storage medium


Publications (1)

Publication Number Publication Date
CN114546383A true CN114546383A (en) 2022-05-27

Family

ID=81678418

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210160003.6A Pending CN114546383A (en) 2022-02-22 2022-02-22 Driving scene display method and device, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN114546383A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2024119678A1 (en) * 2022-12-07 2024-06-13 深圳海星智驾科技有限公司 Automatic driving working method and apparatus for engineering machine, electronic device, and system

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111783225A (en) * 2020-06-28 2020-10-16 北京百度网讯科技有限公司 Method and device for processing scenes in simulation system
CN112069643A (en) * 2019-05-24 2020-12-11 北京车和家信息技术有限公司 Automatic driving simulation scene generation method and device
CN112560253A (en) * 2020-12-08 2021-03-26 中国第一汽车股份有限公司 Method, device and equipment for reconstructing driving scene and storage medium
CN112748977A (en) * 2020-12-09 2021-05-04 北京梧桐车联科技有限责任公司 Method, device and system for displaying driving scene
CN113221359A (en) * 2021-05-13 2021-08-06 京东鲲鹏(江苏)科技有限公司 Simulation scene generation method, device, equipment and storage medium
CN113391801A (en) * 2021-06-11 2021-09-14 斑马网络技术有限公司 Recommendation engine architecture based on cloud service
CN113859264A (en) * 2021-09-17 2021-12-31 阿波罗智联(北京)科技有限公司 Vehicle control method, device, electronic device and storage medium




Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20220527