CN112383674A - Data and video synchronous display method, device, vehicle and medium - Google Patents

Data and video synchronous display method, device, vehicle and medium

Info

Publication number
CN112383674A
Authority
CN
China
Prior art keywords
preset
data
video
vehicle
result data
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202011239400.XA
Other languages
Chinese (zh)
Inventor
祝铭含
曲白雪
祁旭
白天晟
杨航
王祎男
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
FAW Group Corp
Original Assignee
FAW Group Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by FAW Group Corp filed Critical FAW Group Corp
Priority to CN202011239400.XA priority Critical patent/CN112383674A/en
Publication of CN112383674A publication Critical patent/CN112383674A/en
Pending legal-status Critical Current

Classifications

    • H — ELECTRICITY
    • H04 — ELECTRIC COMMUNICATION TECHNIQUE
    • H04N — PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 5/00 — Details of television systems
    • H04N 5/04 — Synchronising

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Controls And Circuits For Display Device (AREA)
  • Closed-Circuit Television Systems (AREA)

Abstract

The invention discloses a method, a device, a vehicle and a medium for synchronously displaying data and video. The method comprises the following steps: acquiring intermediate variables and result data obtained through processing by a sensor data fusion algorithm in a vehicle within a preset simulation duration; acquiring a scene video of the vehicle within a preset driving duration, wherein the preset driving duration and the preset simulation duration correspond to the same time information; and controlling a first preset window and a second preset window to display information synchronously, wherein the information displayed in the first preset window is the intermediate variables and the result data, and the information displayed in the second preset window is the scene video. According to the technical scheme, the intermediate variables and result data obtained by fusing the sensor data, and the scene video corresponding to that sensor data, are acquired separately and displayed synchronously, so that the correspondence between the result data and the scene video can be determined and the result data can be visualized. The time at which an error occurs during operation can therefore be located accurately, which makes it easier for technicians to find and reproduce the erroneous process and to observe debugging parameters.

Description

Data and video synchronous display method, device, vehicle and medium
Technical Field
The embodiment of the invention relates to an intelligent driving technology, in particular to a method, a device, a vehicle and a medium for synchronously displaying data and video.
Background
With the continuous development of intelligent driving technology, multiple sensors have become standard equipment on unmanned vehicles. The data returned by the multiple sensors are displayed in tabular form after being fused, so the relative position between the data and the host vehicle, the relative positions among the data themselves, and the time synchronization between the data and the scene video cannot be presented vividly to developers.
In the prior art, the large amount of real-time data collected by the sensors produces a large amount of result data after running through an algorithm model. When the result data are inspected only by eye, the correspondence between the result data and the scene video cannot be determined, and the result data cannot be visualized. The time at which an error occurs during operation cannot be located accurately, so it is inconvenient for technicians to find and reproduce the error and to observe debugging parameters.
Disclosure of Invention
The invention provides a method, a device, a vehicle and a medium for synchronously displaying data and video, which make it convenient to determine the correspondence between result data and scene video and realize visualization of the result data.
In a first aspect, an embodiment of the present invention provides a method for displaying data and video synchronously, where the method includes:
acquiring intermediate variable and result data which are obtained by processing a sensor data fusion algorithm in a vehicle in a preset simulation time length;
acquiring a scene video of a vehicle at a preset driving time, wherein the preset driving time and the preset simulation time are the same time information;
and controlling a first preset window and a second preset window to synchronously display information, wherein the information displayed by the first preset window is the intermediate variable and the result data, and the information displayed by the second preset window is the scene video.
In a second aspect, an embodiment of the present invention further provides a data and video synchronous display device, where the device includes: a first acquisition module, a second acquisition module, and a presentation module, wherein,
the first acquisition module is used for acquiring intermediate variables and result data obtained through processing by a sensor data fusion algorithm in the vehicle within a preset simulation duration;
the second acquisition module is used for acquiring a scene video of a vehicle at a preset driving time, wherein the preset driving time and the preset simulation time are the same time information;
and the display module is used for controlling a first preset window and a second preset window to synchronously display information, wherein the information displayed by the first preset window is the intermediate variable and the result data, and the information displayed by the second preset window is the scene video.
In a third aspect, an embodiment of the present invention further provides a vehicle, including:
one or more processors;
storage means for storing one or more programs;
the sensing device is used for acquiring sensing data in the running process of the vehicle;
the camera is used for acquiring a scene video of the running vehicle;
when executed by the one or more processors, cause the one or more processors to implement the method of synchronized data and video display according to the first aspect.
In a fourth aspect, embodiments of the present invention also provide a storage medium containing computer-executable instructions, which when executed by a computer processor, are configured to perform the method for synchronized display of data and video according to the first aspect.
In the embodiments of the invention, intermediate variables and result data obtained through processing by a sensor data fusion algorithm in the vehicle within a preset simulation duration are acquired; a scene video of the vehicle within a preset driving duration is acquired, wherein the preset driving duration and the preset simulation duration correspond to the same time information; and a first preset window and a second preset window are controlled to display information synchronously, wherein the information displayed in the first preset window is the intermediate variables and the result data, and the information displayed in the second preset window is the scene video. This solves the problem that the correspondence between the result data and the scene video is inconvenient to determine, and realizes visualization of the result data. The time at which an error occurs during operation can therefore be located accurately, which makes it easier for technicians to find and reproduce the erroneous process and to observe debugging parameters.
Drawings
Fig. 1 is a flowchart of a method for displaying data and video synchronously according to an embodiment of the present invention;
fig. 2 is a schematic display diagram of a method for displaying data and video synchronously according to an embodiment of the present invention;
fig. 3 is a flowchart of a method for displaying data and video synchronously according to a second embodiment of the present invention;
fig. 4 is a flowchart illustrating an implementation of a method for displaying data and video synchronously according to a second embodiment of the present invention;
fig. 5 is a structural diagram of a data and video synchronous display device according to a third embodiment of the present invention;
fig. 6 is a schematic structural diagram of a vehicle according to a fourth embodiment of the present invention.
Detailed Description
The present invention will be described in further detail with reference to the accompanying drawings and examples. It is to be understood that the specific embodiments described herein are merely illustrative of the invention and are not limiting of the invention. It should be further noted that, for the convenience of description, only some of the structures related to the present invention are shown in the drawings, not all of the structures.
Before discussing exemplary embodiments in more detail, it should be noted that some exemplary embodiments are described as processes or methods depicted as flowcharts. Although a flowchart may describe the operations (or steps) as a sequential process, many of the operations can be performed in parallel, concurrently or simultaneously. In addition, the order of the operations may be re-arranged. The process may be terminated when its operations are completed, but may have additional steps not included in the figure. The processes may correspond to methods, functions, procedures, subroutines, and the like. In addition, the embodiments and features of the embodiments in the present invention may be combined with each other without conflict.
Example one
Fig. 1 is a flowchart of a data and video synchronous display method according to an embodiment of the present invention, where the present embodiment is applicable to a situation where data and video are synchronously displayed during a vehicle operation process, and the method may be executed by a vehicle system, and specifically includes the following steps:
and step 110, acquiring intermediate variable and result data which are processed by a sensor data fusion algorithm in the vehicle in a preset simulation time length.
Specifically, sensor data returned by multiple sensors are synthesized through a fusion algorithm to obtain a consistent interpretation and description of the running process of the vehicle. In other words, the in-vehicle sensor data may be passed through a fusion algorithm to derive intermediate variables and result data. The intermediate variables may include redundant data, partial results and the like generated in the process of calculating the result data from the plurality of sensor data.
In addition, sensor data fusion methods may include random methods and artificial intelligence methods, wherein the random methods can include a weighted average method, Kalman filtering, D-S evidence reasoning, production rules and the like, and the artificial intelligence methods can include fuzzy logic reasoning and artificial neural network methods.
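As a concrete illustration of the simplest of these approaches, the following Python sketch (illustrative only, not the patented algorithm; the function name, readings and confidence weights are hypothetical) fuses the position readings of several sensors by a weighted average and exposes both the intermediate variables and the result data that such a step produces:

```python
# Minimal weighted-average fusion sketch: several sensors report the same target's
# position, and the fusion step yields intermediate variables (normalized weights,
# per-sensor contributions) alongside the fused result data.
import numpy as np

def fuse_weighted_average(sensor_positions, weights):
    """sensor_positions: (n_sensors, 2) array of x/y readings; weights: per-sensor confidences."""
    weights = np.asarray(weights, dtype=float)
    weights = weights / weights.sum()                     # intermediate variable: normalized weights
    contributions = sensor_positions * weights[:, None]   # intermediate variable: weighted terms
    fused = contributions.sum(axis=0)                     # result data: fused position
    return {"weights": weights, "contributions": contributions}, fused

# Three sensors observing roughly the same obstacle.
readings = np.array([[10.2, 3.1], [10.0, 3.3], [10.4, 2.9]])
intermediates, result = fuse_weighted_average(readings, weights=[0.5, 0.3, 0.2])
print(intermediates["weights"], result)
```

A Kalman filter or D-S evidence reasoning step would produce richer intermediate variables (state covariances, belief masses), but the same split between intermediate variables and result data applies.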
The preset simulation duration may correspond to the start time and the end time of the operation of the data fusion algorithm model, and the start time and the end time of the operation of the data fusion algorithm model may be determined according to the start time and the end time of the scene video. The preset simulation duration can also be the time for operating the data fusion algorithm model once, and the single operation time can correspond to the acquisition time of a single scene video.
Intermediate variables and result data are obtained within the preset simulation duration. Because the preset simulation duration can correspond to the time over which the scene video is obtained, the intermediate variables, the result data and the scene video can correspond to one another in time, which facilitates synchronous display of the data and the video.
And 120, acquiring a scene video of the vehicle at a preset driving time, wherein the preset driving time and the preset simulation time are the same time information.
Specifically, a scene video of the vehicle can be recorded by the automobile data recorder, and during operation of the method a scene video of the preset duration can be acquired from the scene video library collected by the automobile data recorder. A scene video of the preset driving duration can be obtained by clipping the scene videos in the scene video library.
In addition, the scene video of the vehicle may include a forward-looking scene video and a rearward-looking scene video.
As described above, the preset driving time and the preset simulation time are the same time information, so that the intermediate variable, the result data and the scene video at the same time can be conveniently acquired.
And step 130, controlling a first preset window and a second preset window to synchronously display information, wherein the information displayed by the first preset window is the intermediate variable and the result data, and the information displayed by the second preset window is the scene video.
Specifically, the intermediate variable and the result data can be displayed in a first preset window of the same display interface, and the scene video can be displayed in a second preset window.
The intermediate variables and the resulting data may be read by display software, which may include MATLAB.
Fig. 2 is a schematic display diagram of a data and video synchronous display method according to an embodiment of the present invention. As shown in fig. 2, the first preset window of MATLAB may display the intermediate variables and the result data in a coordinate system whose origin is the sensor that collects the data, with the position information carried in the intermediate variables and the result data, relative to that sensor, plotted as the horizontal and vertical coordinates. In order to distinguish the intermediate variables from the result data, they may be displayed in the coordinate system with different symbols or colors. For example, as shown in fig. 2, diamonds may represent the intermediate variables and the result data, the two being distinguished by color; the specific colors may be determined according to the actual situation and are not specifically limited herein. The square box may represent the sensor that collects the data, and the dashed track may represent the trajectory of the vehicle.
A second preset window of MATLAB may display a front-view scene video of the vehicle. In addition, in order to grasp the specific situation of the vehicle operation more accurately, the scene video of the vehicle may include a front-view scene video and a rear-view scene video, and the rear-view scene video may be displayed in a third preset window of MATLAB.
In addition, when the display software includes MATLAB, the intermediate variables and result data need to be type-converted before being displayed, that is, converted into a format that MATLAB can recognize; the formats may specifically include xls, txt, dat, inp, asv, tiff, png, jpg, bmp, mp4, avi and the like.
In addition, as shown in fig. 2, the intermediate variables and the result data may also be displayed in two preset windows, respectively.
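The layout described above can be approximated in a few lines outside MATLAB. The following Python/matplotlib sketch is only an analogue of fig. 2 under assumed data (the coordinates, timestamp and placeholder frame are invented for illustration); one axes plays the role of the first preset window and another the second preset window:

```python
# Two-window analogue of fig. 2: fused data in a sensor-centred coordinate system
# on the left, the scene-video frame for the same timestamp on the right.
import numpy as np
import matplotlib.pyplot as plt

fig, (data_win, video_win) = plt.subplots(1, 2, figsize=(10, 4))

# First window: sensor at the origin, diamonds for result data, circles for intermediates.
data_win.scatter([0], [0], marker="s", c="k", label="sensor (origin)")
data_win.scatter([10.2, 15.6], [3.1, -2.0], marker="D", c="tab:blue", label="result data")
data_win.scatter([10.0, 15.9], [3.3, -1.7], marker="o", c="tab:orange", label="intermediate")
data_win.plot([0, 5, 10], [0, 0.5, 1.0], "k--", label="vehicle trajectory")
data_win.set_title("t = 12.35 s")
data_win.legend(loc="upper right")

# Second window: the scene-video frame for the same timestamp.
frame = np.random.randint(0, 255, size=(120, 160, 3), dtype=np.uint8)  # placeholder frame
video_win.imshow(frame)
video_win.set_title("front-view scene video")
video_win.axis("off")

plt.tight_layout()
plt.show()
```

In practice the frame would be decoded from the recorded scene video at the timestamp shown in the data window, and a third axes could hold the rear-view video.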
The embodiment of the invention provides a data and video synchronous display method: acquiring intermediate variables and result data obtained through processing by a sensor data fusion algorithm in the vehicle within a preset simulation duration; acquiring a scene video of the vehicle within a preset driving duration, wherein the preset driving duration and the preset simulation duration correspond to the same time information; and controlling a first preset window and a second preset window to display information synchronously, wherein the information displayed in the first preset window is the intermediate variables and the result data, and the information displayed in the second preset window is the scene video. This solves the problem that the correspondence between the result data and the scene video is inconvenient to determine, and realizes visualization of the result data. The time at which an error occurs during operation can therefore be located accurately, which makes it easier for technicians to find and reproduce the erroneous process and to observe debugging parameters.
Example two
Fig. 3 is a flowchart of a data and video synchronous display method according to a second embodiment of the present invention, which is further refined on the basis of the above embodiment. In this embodiment, the method may further include:
and 310, acquiring intermediate variable and result data which are processed by a sensor data fusion algorithm in the vehicle in a preset simulation time length.
In one embodiment, step 310 may specifically include:
and inputting the sensor data acquired by the plurality of sensors into a preset algorithm model, and acquiring an output result comprising an intermediate variable and result data.
Specifically, the preset algorithm model may be constructed based on the sensor data fusion method described in the first embodiment, that is, may be constructed based on a random method and an artificial intelligence method. The preset algorithm model fuses sensor data acquired by a plurality of sensors to obtain intermediate variables and result data.
And intercepting intermediate variables and result data in an output result based on the preset simulation duration.
Specifically, the input of the preset algorithm model may include sensor data acquired by a plurality of sensors within a preset time period, and the output may include intermediate variables and result data within the preset time period.
In order to keep the time information of the intermediate variable, the result data and the scene video consistent, the intermediate variable and the result data of the preset simulation duration may be intercepted from the intermediate variable and the result data within the preset time.
In addition, in order to facilitate data acquisition and data processing, the preset simulation duration can also be the acquisition cycle duration of the plurality of sensors; the sensor data collected within the preset simulation duration can be input into the preset algorithm model for processing, so that the intermediate variables and result data for the preset simulation duration are obtained.
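A minimal sketch of this interception step, assuming each fused record carries a timestamp in seconds (the field names and values below are hypothetical), might look as follows:

```python
# Keep only the fused output that falls inside the preset simulation window.
def clip_to_simulation_window(records, t_start, sim_duration):
    """Keep records whose timestamp lies in [t_start, t_start + sim_duration)."""
    t_end = t_start + sim_duration
    return [r for r in records if t_start <= r["t"] < t_end]

fused_output = [
    {"t": 0.00, "intermediate": [0.5, 0.3], "result": [10.1, 3.1]},
    {"t": 0.05, "intermediate": [0.6, 0.2], "result": [10.2, 3.0]},
    {"t": 0.10, "intermediate": [0.4, 0.4], "result": [10.3, 2.9]},
]
window = clip_to_simulation_window(fused_output, t_start=0.0, sim_duration=0.1)
print(len(window))  # -> 2 (the records at 0.00 s and 0.05 s)
```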
And determining the variable names of the intermediate variable and the result data based on the variable names of the preset algorithm model.
In general, variable names of sensor data acquired by a sensor are inconsistent with variable names of a preset algorithm model, so that the variable names of the intermediate variable and the result data need to be determined according to the variable names of the preset algorithm model before the intermediate variable, the result data and the scene video are displayed on the same screen.
Correspondingly, if the data types of the intermediate variables and the result data obtained by the preset algorithm model are not matched with the display software, the data types of the intermediate variables and the result data need to be converted.
And 320, acquiring a scene video of the vehicle at a preset driving time, wherein the preset driving time and the preset simulation time are the same time information.
In one embodiment, step 320 may specifically include:
and determining the time interval of the video frames according to the total frame number and the total time of the video.
Specifically, since the time interval between video frames is fixed, the interval can be obtained by dividing the total time of the video by its total number of frames. The total number of frames can in turn be determined from the frame rate of the video and its total time.
And controlling the preset driving time length of the acquired scene video to be equal to the preset simulation time length according to the time interval of the video frames and the number of the video frames.
Specifically, the time interval and the number of video frames determine the duration of the scene video, that is, the preset driving duration of the scene video can be controlled through the number of video frames. The preset simulation duration may be determined by adjusting the acquisition cycle duration of the plurality of sensors. The preset driving duration and the preset simulation duration can thereby be controlled to be equal.
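The timing relationship can be written down directly. The sketch below (values chosen only for illustration) computes the frame interval from the total time and total frame count, and then the number of frames needed so that the clipped driving duration equals the preset simulation duration:

```python
# Frame interval = total video time / total frame count; frame count for a window
# = preset simulation duration / frame interval.
def frame_interval(total_time_s, total_frames):
    return total_time_s / total_frames

def frames_for_duration(preset_sim_duration_s, interval_s):
    return round(preset_sim_duration_s / interval_s)

interval = frame_interval(total_time_s=60.0, total_frames=1800)   # 30 fps recording
n_frames = frames_for_duration(preset_sim_duration_s=10.0, interval_s=interval)
print(interval, n_frames)  # ~0.0333 s per frame, 300 frames for a 10 s window
```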
And acquiring a scene video of the running vehicle based on the preset running time.
Specifically, after the preset driving time is determined, the scene video of the preset driving time can be intercepted in the scene video library according to the preset driving time.
Accordingly, the intercepted scene video may also include a front view scene video and a rear view scene video.
And 330, controlling a first preset window and a second preset window to synchronously display information, wherein the information displayed by the first preset window is the intermediate variable and the result data, and the information displayed by the second preset window is the scene video.
In one embodiment, step 330 may specifically include:
and displaying time information, the intermediate variable of preset simulation duration and the result data in a first preset window, wherein the origin is the position of a sensor for acquiring sensing data, and the horizontal and vertical coordinates are the relative positions of the vehicle and the sensor.
Specifically, the display positions of the time information and the intermediate variable are not particularly limited herein, and the time information and the intermediate variable may be displayed in the upper right corner of the coordinate system as shown in fig. 2. In addition, the coordinate system can also display the vehicle position information and the running track of the vehicle. The intermediate variable and the result data are displayed in the coordinate system, so that the result data can be visualized, a researcher can find out error data conveniently, and error information correction and the like can be further performed.
And displaying the scene video with preset driving time in a second preset window.
Specifically, a scene video of the preset driving time duration may be displayed in the second preset window. The scene video may include a forward view scene video and/or a rearward view scene video. When the scene video may include a front view scene video or a rear view scene video, the second preset window may display the front view scene video or the rear view scene video; when the scene video may include a front view scene video and a rear view scene video, the second preset window may display the front view scene video, and the third preset window may display the rear view scene video. The first preset window, the second preset window and the third preset window can be displayed simultaneously, the first preset window can be located on one side of the display window, and the second preset window and the third preset window can be located at the first end and the second end of the other side of the display window respectively.
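Assuming the fused records and the clipped video frames start at the same moment and share the per-frame interval computed earlier, the windows can be driven in lock-step by pairing each frame with the data record closest to it in time. The following sketch is illustrative only (the record fields and frame stand-ins are hypothetical, not taken from the patent):

```python
# Pair each video frame with the fused record nearest in time, for lock-step display.
def synchronized_stream(records, frames, interval_s):
    """Yield (timestamp, data_record, video_frame) triples."""
    for i, frame in enumerate(frames):
        t = i * interval_s
        record = min(records, key=lambda r: abs(r["t"] - t))  # closest record in time
        yield t, record, frame

records = [{"t": 0.00, "result": [10.1, 3.1]}, {"t": 0.05, "result": [10.2, 3.0]}]
frames = ["frame_0", "frame_1"]  # stand-ins for decoded images
for t, rec, frm in synchronized_stream(records, frames, interval_s=0.05):
    print(f"t={t:.2f}s  result={rec['result']}  frame={frm}")
```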
And step 340, determining error information according to the intermediate variable, the result data and the display information of the scene video.
Specifically, if the intermediate variables and the result data are inconsistent with, or do not correspond to, the content of the current video frame, the intermediate variables or result data produced by the preset algorithm model are erroneous and may be determined to be error information.
Synchronously displaying the intermediate variables, the result data and the video frames makes it convenient to locate the time at which an error occurs and to examine how modified parameters affect different algorithms in different application scenarios, which greatly facilitates the testing work of engineers during debugging of the intelligent driving algorithm.
And step 350, modifying the algorithm model parameters based on the error information, and returning to the step of acquiring intermediate variables and result data obtained through processing by the sensor data fusion algorithm in the vehicle within the preset simulation duration.
Specifically, if there is error information, the algorithm model parameters may be modified based on the error information, so that the algorithm model more matches the current vehicle operation.
And 360, storing the video, the intermediate variable, the result data and the error information in an associated manner.
Specifically, video frames in a scene video and intermediate variable and result data corresponding to the video frames can be captured and stored. In addition, when there is error information, the error information may be associated with the video, the intermediate variable, and the result data and stored.
In one embodiment, a list or database of the names and storage addresses of the scene videos and sensor data may be created, and by cyclically calling the different scene videos and sensor data in MATLAB, the videos generated from all of the scene videos and sensor data can be obtained in one run.
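One way to realize such a catalogue outside MATLAB is sketched below in Python; the file paths, field names and error note are purely illustrative assumptions, and process_pair stands in for loading a video/data pair and rendering the synchronized display:

```python
# Keep a small catalogue associating each scene video with its sensor-data file and
# any error notes, loop over it to process every pair in one run, then persist it.
import json

catalogue = [
    {"video": "logs/run01_front.mp4", "sensor_data": "logs/run01_fusion.csv", "errors": []},
    {"video": "logs/run02_front.mp4", "sensor_data": "logs/run02_fusion.csv",
     "errors": [{"t": 12.35, "note": "result data missing obstacle visible in frame"}]},
]

def process_pair(entry):
    # Placeholder for: load sensor data, load video, render the synchronized display.
    print("processing", entry["video"], "with", entry["sensor_data"])

for entry in catalogue:
    process_pair(entry)

# Persist the association so an error found during review stays linked to its video/data pair.
with open("catalogue.json", "w") as f:
    json.dump(catalogue, f, indent=2)
```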
According to the technical scheme of this embodiment, intermediate variables and result data obtained through processing by a sensor data fusion algorithm in the vehicle within a preset simulation duration are acquired; a scene video of the vehicle within a preset driving duration is acquired, wherein the preset driving duration and the preset simulation duration correspond to the same time information; and a first preset window and a second preset window are controlled to display information synchronously, wherein the information displayed in the first preset window is the intermediate variables and the result data, and the information displayed in the second preset window is the scene video. Error information is determined according to the intermediate variables, the result data and the displayed information of the video. The algorithm model parameters are modified based on the error information, and the step of acquiring intermediate variables and result data is executed again. The video, the intermediate variables, the result data and the error information are stored in association with one another. This solves the problem that the correspondence between the result data and the scene video is inconvenient to determine, and realizes visualization of the result data. The time at which an error occurs during operation can therefore be located accurately, which makes it easier for technicians to find and reproduce the erroneous process and to observe debugging parameters.
Fig. 4 is a flowchart of an implementation of a method for displaying data and video synchronously according to a second embodiment of the present invention, which exemplarily shows one implementation manner. As shown in figure 4 of the drawings,
and step 410, setting a preset simulation time length of the preset algorithm model and a preset driving time length of the acquired scene video, and enabling the preset simulation time length and the preset driving time length to be the same time information.
Specifically, the preset algorithm model and the acquisition of the scene video are run over the same time period, so that the acquired intermediate variables, result data and scene video correspond to one another in time, which facilitates displaying the intermediate variables, the result data and the scene video synchronously.
And 420, loading the sensor data according to the preset simulation duration and running the preset algorithm model at the same time, so as to obtain the intermediate variables and result data.
Specifically, the inputs of the pre-set algorithm model may include sensor data acquired by a plurality of sensors, and the outputs may include intermediate variables and result data.
And 430, acquiring a scene video within a preset driving time.
Specifically, a scene video corresponding to the preset driving time duration may be captured from a scene video library formed by the vehicle event data recorder.
And 440, displaying the intermediate variable and the result data in a first preset window of the MATLAB, and displaying the scene video in a second preset window of the MATLAB.
Specifically, the scene video displayed in the second preset window of the MATLAB may include a forward-view scene video, and the backward-view scene video may also be displayed in the third preset window of the MATLAB.
And step 450, storing the correlated intermediate variables, result data and scene videos based on a preset storage space.
Specifically, a list or database of the names and storage addresses of the scene videos and sensor data may be created, and by cyclically calling the different scene videos and sensor data in MATLAB, the videos generated from all of the scene videos and sensor data can be obtained in one run.
EXAMPLE III
Fig. 5 is a structural diagram of a data and video synchronous display device according to a third embodiment of the present invention, where the device is applicable to a situation where sensor data and video are synchronously displayed during vehicle operation, so as to determine a correspondence between result data and scene video, and achieve visualization of the result data. The device may be implemented by software and/or hardware and is typically integrated into a vehicle system.
As shown in fig. 5, the apparatus includes: a first acquisition module 510, a second acquisition module 520, and a presentation module 530, wherein,
a first obtaining module 510, configured to obtain intermediate variables and result data processed by a sensor data fusion algorithm in a vehicle for a preset simulation duration;
the second obtaining module 520 is configured to obtain a scene video of a vehicle at a preset driving time, where the preset driving time and the preset simulation time are the same time information;
a display module 530, configured to control a first preset window and a second preset window to perform information display synchronously, where the information displayed by the first preset window is the intermediate variable and the result data, and the information displayed by the second preset window is the scene video.
The data and video synchronous display device provided by this embodiment acquires intermediate variables and result data obtained through processing by a sensor data fusion algorithm in the vehicle within a preset simulation duration; acquires a scene video of the vehicle within a preset driving duration, wherein the preset driving duration and the preset simulation duration correspond to the same time information; and controls a first preset window and a second preset window to display information synchronously, wherein the information displayed in the first preset window is the intermediate variables and the result data, and the information displayed in the second preset window is the scene video. This solves the problem that the correspondence between the result data and the scene video is inconvenient to determine, and realizes visualization of the result data. The time at which an error occurs during operation can therefore be located accurately, which makes it easier for technicians to find and reproduce the erroneous process and to observe debugging parameters.
On the basis of the foregoing embodiment, the first obtaining module 510 is specifically configured to:
inputting sensor data acquired by a plurality of sensors into a preset algorithm model, and acquiring output results comprising intermediate variables and result data;
intercepting intermediate variables and result data in an output result based on a preset simulation duration;
and determining the variable names of the intermediate variable and the result data based on the variable names of the preset algorithm model.
On the basis of the foregoing embodiment, the second obtaining module 520 is specifically configured to:
determining the time interval of the video frames according to the total frame number and the total time of the video;
controlling the preset driving time length of the acquired scene video to be equal to the preset simulation time length according to the time interval of the video frames and the number of the video frames;
and acquiring a scene video of the running vehicle based on the preset running time.
On the basis of the foregoing embodiments, the display module 530 is specifically configured to:
displaying time information, the intermediate variable of preset simulation duration and the result data in a first preset window by taking a sensor for collecting sensing data as an origin and taking the relative position of a vehicle and the sensor as a horizontal coordinate and a vertical coordinate;
and displaying the scene video with preset driving time in a second preset window.
On the basis of the above embodiment, the device may further include:
the determining module is used for determining error information according to the intermediate variable, the result data and the display information of the video;
and the modification module is used for modifying the algorithm model parameters based on the error information and triggering re-execution of the step of acquiring intermediate variables and result data obtained through processing by the sensor data fusion algorithm in the vehicle within the preset simulation duration.
On the basis of the above embodiment, the device may further include:
and the association module is used for associating and storing the video, the intermediate variable, the result data and the error information.
The data and video synchronous display device provided by the embodiment of the invention can execute the data and video synchronous display method provided by any embodiment of the invention, and has corresponding functional modules and beneficial effects of the execution method.
Example four
Fig. 6 is a schematic structural diagram of a vehicle according to a fourth embodiment of the present invention, as shown in fig. 6, the vehicle includes a processor 610, a memory 620, a sensing device 630, and a camera 640; the number of processors 610 in the vehicle may be one or more, and one processor 610 is taken as an example in fig. 6; the processor 610, the memory 620, the sensing device 630, and the camera 640 in the vehicle may be connected by a bus or other means, as exemplified by the bus connection in fig. 6.
The memory 620, as a computer-readable storage medium, may be used for storing software programs, computer-executable programs, and modules, such as program instructions/modules corresponding to the data and video synchronous display method in the embodiment of the present invention (for example, the first obtaining module 510, the second obtaining module 520, and the presentation module 530 in the data and video synchronous display apparatus). The processor 610 executes various functional applications of the vehicle and data processing, i.e., implements the data and video synchronous display method described above, by executing the software programs, instructions, and modules stored in the memory 620.
The memory 620 may mainly include a storage program area and a storage data area, wherein the storage program area may store an operating system, an application program required for at least one function; the storage data area may store data created according to the use of the terminal, and the like. Further, the memory 620 may include high speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other non-volatile solid state storage device. In some examples, the memory 620 may further include memory located remotely from the processor 610, which may be connected to the vehicle over a network. Examples of such networks include, but are not limited to, the internet, intranets, local area networks, mobile communication networks, and combinations thereof.
A sensing device 630, configured to acquire sensing data during operation of the vehicle; and the camera 640 is used for acquiring a scene video of the running vehicle.
The vehicle provided by the embodiment of the invention can execute the data and video synchronous display method provided by the embodiment, and has corresponding functions and beneficial effects.
EXAMPLE five
An embodiment of the present invention further provides a storage medium containing computer-executable instructions, which when executed by a computer processor, perform a method for synchronous display of data and video, the method including:
acquiring intermediate variable and result data which are obtained by processing a sensor data fusion algorithm in a vehicle in a preset simulation time length;
acquiring a scene video of a vehicle at a preset driving time, wherein the preset driving time and the preset simulation time are the same time information;
and controlling a first preset window and a second preset window to synchronously display information, wherein the information displayed by the first preset window is the intermediate variable and the result data, and the information displayed by the second preset window is the scene video.
Of course, the storage medium provided by the embodiment of the present invention contains computer-executable instructions, and the computer-executable instructions are not limited to the operations of the method described above, and may also perform related operations in the data and video synchronous display method provided by any embodiment of the present invention.
From the above description of the embodiments, it is obvious for those skilled in the art that the present invention can be implemented by software and necessary general hardware, and certainly, can also be implemented by hardware, but the former is a better embodiment in many cases. Based on such understanding, the technical solutions of the present invention may be embodied in the form of a software product, which can be stored in a computer-readable storage medium, such as a floppy disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a FLASH Memory (FLASH), a hard disk or an optical disk of a computer, and includes several instructions for enabling a computer device (which may be a personal computer, a server, or a network device) to execute the methods according to the embodiments of the present invention.
It should be noted that, in the embodiment of the data and video synchronous display device, the included units and modules are only divided according to functional logic, but are not limited to the above division as long as the corresponding functions can be realized; in addition, specific names of the functional units are only for convenience of distinguishing from each other, and are not used for limiting the protection scope of the present invention.
It is to be noted that the foregoing is only illustrative of the preferred embodiments of the present invention and the technical principles employed. It will be understood by those skilled in the art that the present invention is not limited to the particular embodiments described herein, but is capable of various obvious changes, rearrangements and substitutions as will now become apparent to those skilled in the art without departing from the scope of the invention. Therefore, although the present invention has been described in greater detail by the above embodiments, the present invention is not limited to the above embodiments, and may include other equivalent embodiments without departing from the spirit of the present invention, and the scope of the present invention is determined by the scope of the appended claims.

Claims (10)

1. A method for displaying data and video synchronously, comprising:
acquiring intermediate variable and result data which are obtained by processing a sensor data fusion algorithm in a vehicle in a preset simulation time length;
acquiring a scene video of a vehicle at a preset driving time, wherein the preset driving time and the preset simulation time are the same time information;
and controlling a first preset window and a second preset window to synchronously display information, wherein the information displayed by the first preset window is the intermediate variable and the result data, and the information displayed by the second preset window is the scene video.
2. The method for synchronously displaying data and video according to claim 1, wherein the step of obtaining the scene video of the vehicle within a preset driving time, wherein the preset driving time and the preset simulation time are the same time information, comprises the steps of:
determining the time interval of the video frames according to the total frame number and the total time of the video;
controlling the preset driving time length of the acquired scene video to be equal to the preset simulation time length according to the time interval of the video frames and the number of the video frames;
and acquiring a scene video of the running vehicle based on the preset running time.
3. The method for synchronously displaying data and video according to claim 1, wherein obtaining intermediate variable and result data processed by a sensor data fusion algorithm in a vehicle for a preset simulation time period comprises:
inputting sensor data acquired by a plurality of sensors into a preset algorithm model, and acquiring output results comprising intermediate variables and result data;
and intercepting intermediate variables and result data in an output result based on the preset simulation duration.
4. The method of claim 3, further comprising, after intercepting the intermediate variables and result data in the output result based on the preset simulation duration:
and determining the variable names of the intermediate variable and the result data based on the variable names of the preset algorithm model.
5. The method for synchronously displaying data and video according to claim 1, wherein displaying the intermediate variable and the result data and the video in a first preset window and a second preset window respectively comprises:
displaying time information, the intermediate variable of preset simulation duration and the result data in a first preset window, wherein an original point is a sensor position for acquiring sensing data, and a horizontal coordinate and a vertical coordinate are relative positions of a vehicle and the sensor;
and displaying the scene video with preset driving time in a second preset window.
6. The method for synchronously displaying data and video according to claim 1, further comprising, after controlling the first preset window and the second preset window to be synchronously displayed, the steps of:
determining error information according to the intermediate variable, the result data and the display information of the scene video;
and modifying the algorithm model parameters based on the error information, and continuously executing and acquiring intermediate variable and result data obtained by processing a sensor data fusion algorithm in the vehicle in a preset simulation time length.
7. The method for synchronously displaying data and video according to claim 6, after controlling the first preset window and the second preset window to synchronously display information, further comprising:
and storing the scene video, the intermediate variable, the result data and the error information in an associated manner.
8. A data and video synchronized display device, comprising: a first acquisition module, a second acquisition module, and a presentation module, wherein,
the first acquisition module is used for acquiring intermediate variables and result data obtained through processing by a sensor data fusion algorithm in the vehicle within a preset simulation time length;
the second acquisition module is used for acquiring a scene video of a vehicle at a preset driving time, wherein the preset driving time and the preset simulation time are the same time information;
and the display module is used for controlling a first preset window and a second preset window to synchronously display information, wherein the information displayed by the first preset window is the intermediate variable and the result data, and the information displayed by the second preset window is the scene video.
9. A vehicle, characterized in that the vehicle comprises:
one or more processors;
storage means for storing one or more programs;
the sensing device is used for acquiring sensing data in the running process of the vehicle;
the camera is used for acquiring a scene video of the running vehicle;
when executed by the one or more processors, cause the one or more processors to implement a method of synchronized data and video display as claimed in any one of claims 1 to 7.
10. A storage medium containing computer executable instructions for performing the method of simultaneous data and video display according to any one of claims 1-7 when executed by a computer processor.
CN202011239400.XA 2020-11-09 2020-11-09 Data and video synchronous display method, device, vehicle and medium Pending CN112383674A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011239400.XA CN112383674A (en) 2020-11-09 2020-11-09 Data and video synchronous display method, device, vehicle and medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011239400.XA CN112383674A (en) 2020-11-09 2020-11-09 Data and video synchronous display method, device, vehicle and medium

Publications (1)

Publication Number Publication Date
CN112383674A true CN112383674A (en) 2021-02-19

Family

ID=74578861

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011239400.XA Pending CN112383674A (en) 2020-11-09 2020-11-09 Data and video synchronous display method, device, vehicle and medium

Country Status (1)

Country Link
CN (1) CN112383674A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113984091A (en) * 2021-11-17 2022-01-28 中国第一汽车股份有限公司 Positioning evaluation system, method, device and medium for vehicle driving

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111127701A (en) * 2019-12-24 2020-05-08 武汉光庭信息技术股份有限公司 Vehicle failure scene detection method and system
CN111127651A (en) * 2020-03-31 2020-05-08 江苏广宇科技产业发展有限公司 Automatic driving test development method and device based on high-precision visualization technology
CN111563313A (en) * 2020-03-18 2020-08-21 交通运输部公路科学研究所 Driving event simulation reproduction method, system, equipment and storage medium
CN111783225A (en) * 2020-06-28 2020-10-16 北京百度网讯科技有限公司 Method and device for processing scenes in simulation system
CN111897305A (en) * 2020-06-02 2020-11-06 浙江吉利汽车研究院有限公司 Data processing method, device, equipment and medium based on automatic driving

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111127701A (en) * 2019-12-24 2020-05-08 武汉光庭信息技术股份有限公司 Vehicle failure scene detection method and system
CN111563313A (en) * 2020-03-18 2020-08-21 交通运输部公路科学研究所 Driving event simulation reproduction method, system, equipment and storage medium
CN111127651A (en) * 2020-03-31 2020-05-08 江苏广宇科技产业发展有限公司 Automatic driving test development method and device based on high-precision visualization technology
CN111897305A (en) * 2020-06-02 2020-11-06 浙江吉利汽车研究院有限公司 Data processing method, device, equipment and medium based on automatic driving
CN111783225A (en) * 2020-06-28 2020-10-16 北京百度网讯科技有限公司 Method and device for processing scenes in simulation system

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113984091A (en) * 2021-11-17 2022-01-28 中国第一汽车股份有限公司 Positioning evaluation system, method, device and medium for vehicle driving
CN113984091B (en) * 2021-11-17 2024-03-15 中国第一汽车股份有限公司 Positioning evaluation system, method, equipment and medium for vehicle running

Similar Documents

Publication Publication Date Title
NL2024682B1 (en) Assembly monitoring method and device based on deep learning, and readable storage medium
CN113168524A (en) Method and device for testing a driver assistance system
CN105729466B (en) Robot identifying system
CN112863234B (en) Parking space display method and device, electronic equipment and storage medium
CN106327461B (en) A kind of image processing method and device for monitoring
JPWO2019069436A1 (en) Monitoring system and monitoring method
CN108122245B (en) Target behavior description method and device and monitoring equipment
CN110084885A (en) A kind of cloud and image optimization method, device, equipment and storage medium
CN112383674A (en) Data and video synchronous display method, device, vehicle and medium
EP2675150B1 (en) Apparatus and method for providing image
CN111079535B (en) Human skeleton action recognition method and device and terminal
CN113021329B (en) Robot motion control method and device, readable storage medium and robot
CN112702877B (en) Cabinet interior remote monitoring and diagnosis method and system, cabinet device and storage medium
EP4016450A1 (en) State determination device and state determination method
JP6780983B2 (en) Image processing system
Wright et al. Fast in-situ mesh generation using orb-slam2 and openmvs
CN114679569A (en) Production line visual monitoring method and system based on three-dimensional modeling and storage medium
CN114882073A (en) Target tracking method and apparatus, medium, and computer device
KR20230104592A (en) Method and system for annotating sensor data
CN104113711A (en) Image Pickup Apparatus
CN112330977A (en) Automatic parking method and device
US20240129637A1 (en) Display screen flicker detection
CN115988273B (en) State monitoring method, device and medium for video monitoring device
CN116758494B (en) Intelligent monitoring method and system for vehicle-mounted video of internet-connected vehicle
WO2024103482A1 (en) Camera infrared video coloring method and apparatus, and electronic device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20210219