WO2022111225A1 - Driving image display method, device and platform, storage medium, and embedded device - Google Patents

Driving image display method, device and platform, storage medium, and embedded device

Info

Publication number
WO2022111225A1
Authority
WO
WIPO (PCT)
Prior art keywords
cpu
display
environment information
pointer
information
Prior art date
Application number
PCT/CN2021/127903
Other languages
English (en)
French (fr)
Inventor
倪俊超
陈小强
周勃
王志伟
赵从富
Original Assignee
展讯半导体(成都)有限公司
Priority date
Filing date
Publication date
Application filed by 展讯半导体(成都)有限公司
Publication of WO2022111225A1

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F9/00Arrangements for program control, e.g. control units
    • G06F9/06Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/46Multiprogramming arrangements
    • G06F9/50Allocation of resources, e.g. of the central processing unit [CPU]
    • G06F9/5005Allocation of resources, e.g. of the central processing unit [CPU] to service a request
    • G06F9/5027Allocation of resources, e.g. of the central processing unit [CPU] to service a request the resource being a machine, e.g. CPUs, Servers, Terminals
    • G06F9/505Allocation of resources, e.g. of the central processing unit [CPU] to service a request the resource being a machine, e.g. CPUs, Servers, Terminals considering the load
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F9/00Arrangements for program control, e.g. control units
    • G06F9/06Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/44Arrangements for executing specific programs
    • G06F9/4401Bootstrapping
    • G06F9/4418Suspend and resume; Hibernate and awake
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F9/00Arrangements for program control, e.g. control units
    • G06F9/06Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/46Multiprogramming arrangements
    • G06F9/48Program initiating; Program switching, e.g. by interrupt
    • G06F9/4806Task transfer initiation or dispatching
    • G06F9/4843Task transfer initiation or dispatching by program, e.g. task dispatcher, supervisor, operating system
    • G06F9/4881Scheduling strategies for dispatcher, e.g. round robin, multi-level priority queues
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T1/00General purpose image data processing
    • G06T1/20Processor architectures; Processor configuration, e.g. pipelining
    • GPHYSICS
    • G07CHECKING-DEVICES
    • G07CTIME OR ATTENDANCE REGISTERS; REGISTERING OR INDICATING THE WORKING OF MACHINES; GENERATING RANDOM NUMBERS; VOTING OR LOTTERY APPARATUS; ARRANGEMENTS, SYSTEMS OR APPARATUS FOR CHECKING NOT PROVIDED FOR ELSEWHERE
    • G07C5/00Registering or indicating the working of vehicles
    • G07C5/008Registering or indicating the working of vehicles communicating information to a remotely located station
    • GPHYSICS
    • G07CHECKING-DEVICES
    • G07CTIME OR ATTENDANCE REGISTERS; REGISTERING OR INDICATING THE WORKING OF MACHINES; GENERATING RANDOM NUMBERS; VOTING OR LOTTERY APPARATUS; ARRANGEMENTS, SYSTEMS OR APPARATUS FOR CHECKING NOT PROVIDED FOR ELSEWHERE
    • G07C5/00Registering or indicating the working of vehicles
    • G07C5/08Registering or indicating performance data other than driving, working, idle, or waiting time, with or without registering driving, working, idle or waiting time
    • G07C5/0841Registering performance data
    • G07C5/085Registering performance data using electronic data carriers
    • G07C5/0866Registering performance data using electronic data carriers the electronic data carrier being a digital video recorder in combination with video camera
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/04Synchronising
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02DCLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
    • Y02D10/00Energy efficient computing, e.g. low power processors, power management or thermal management

Definitions

  • the present invention relates to the technical field of video display, and in particular, to a driving image display method, device and platform, storage medium and embedded device.
  • the traditional driving image system uses a single camera installed at the rear of the vehicle, which covers only a limited viewing angle behind the vehicle; the driver cannot see the surroundings of the vehicle, which greatly increases driving safety hazards.
  • the new panoramic driving image system uses multi-channel cameras to perceive information about the surrounding environment of the vehicle, and presents the vehicle driving information in a 360-degree (or "360°") two-dimensional (2D) plane mode or a 360° three-dimensional (3D) stereo mode.
  • the output frame rate of the common camera (Camera) acquisition device on the market is 30 frames per second (Frames Per Second, referred to as fps), the frame interval is about 33 milliseconds (ms), and the output format is YUV.
  • YUV is a color encoding method: "Y" represents the luminance (Luma), that is, the grayscale value, while "U" and "V" represent the chrominance.
  • the current new panoramic driving image systems either support only the 360° 2D plane mode and cannot perceive the vehicle's 360° 3D environment information, or support only the 360° 3D mode and are not compatible with the 2D plane mode.
  • the embedded device is constrained by performance and power consumption and cannot run at full power for long periods on a vehicle platform. Running unoptimized 2D/3D algorithms directly on the embedded platform therefore cannot reach a 30 fps frame rate; displaying 2D and 3D images at the same time further reduces the 2D/3D display frame rate, causing display delay, so the real-time display requirement cannot be satisfied.
  • the delay may cause distance offset, which in turn will affect the driver's judgment and endanger driving safety.
  • the technical problem solved by the present invention is the reduced display frame rate and display delay that occur when an existing panoramic driving image system displays 2D and 3D pictures simultaneously.
  • an embodiment of the present invention provides a driving image display method, which includes: when 2D environment information and 3D environment information are displayed synchronously, acquiring 2D display parameters and 3D display parameters, where the 2D display parameters are the parameters used when displaying 2D environment information and the 3D display parameters are the parameters used when displaying 3D environment information; acquiring operation information of the CPU, where the CPU includes multiple cores; and adjusting the frequency of the CPU and/or changing the execution core according to the 2D display parameters, the 3D display parameters and/or the operation information of the CPU, where the execution core is the CPU core running a target thread, and the target thread is a thread for processing 2D environment information and/or 3D environment information.
  • the running information of the CPU includes the temperature of the CPU and/or the frequency of the CPU.
  • the 2D display parameters include a display frame rate for displaying 2D environment information
  • the 3D display parameters include a display frame rate for displaying 3D environment information.
  • the CPU core includes at least one large core and several small cores
  • adjusting the execution core includes: if the CPU frequency to be set needs to be adjusted to the maximum frequency, and the temperature of the CPU rises within a first preset time, changing the execution core to a large core.
  • the method further includes: if the temperature of the CPU drops within a second preset time, changing the execution core to a small core.
  • freq_new is the CPU frequency to be set
  • freq_max is the maximum frequency
  • K is the excess factor
  • load is the current load of the CPU
  • load_max is the maximum load of the CPU
  • P is the compensation factor
  • CPU_temp is the CPU temperature
  • fps_2D/3D is the 2D/3D display frame rate.
  • the method further includes: controlling the CPU cores not executing tasks to enter a light sleep mode.
  • before adjusting the execution core according to the 2D display parameters, the 3D display parameters and/or the running information of the CPU, the method further includes: acquiring the 2D display parameters and the 3D display parameters from the application layer, and acquiring the temperature of the CPU and the frequency of the CPU.
  • the target thread at least includes: a captured image acquisition thread, a 2D algorithm processing thread and a 3D algorithm processing thread.
  • the method further includes: acquiring captured images from several cameras of the vehicle through the captured image acquisition thread and storing them in a buffer area, where the captured images include 2D images and 3D images; each time, intercepting at least one frame of 2D image from the buffer area and sending it to the 2D algorithm processing thread, which processes the 2D image to obtain displayable 2D environment information; and each time, intercepting at least one frame of 3D image from the buffer area and sending it to the 3D algorithm processing thread, which processes the 3D image to obtain displayable 3D environment information.
  • the method further includes: defining at least 5 pointers, including a first pointer, a second pointer, a third pointer, a fourth pointer and a fifth pointer; each time at least one frame of 2D image is intercepted from the buffer area, the first pointer and the second pointer point to the start position and the stop position of the current interception, respectively; each time at least one frame of 3D image is intercepted from the buffer area, the third pointer and the fourth pointer point to the start position and the stop position of the current interception, respectively; the fifth pointer points to the maximum capacity position of the buffer area, and the positions pointed to by the first, second, third and fourth pointers cannot exceed the position pointed to by the fifth pointer.
  • Zero-Copy technology is used in at least one of the following operations: acquiring a captured image, intercepting a 2D image, and intercepting a 3D image.
  • the method further includes: receiving control information sent by the vehicle-mounted central control and/or the gear controller; and synchronously displaying the 2D environment information and the 3D environment information according to the control information.
  • the size of the area where the 2D environment information is displayed is smaller than the size of the area where the 3D environment information is displayed.
  • An embodiment of the present invention further provides a driving image display device, which includes: a parameter acquisition module, configured to acquire 2D display parameters and 3D display parameters when 2D environment information and 3D environment information are displayed synchronously, where the 2D display parameters are the parameters used when displaying 2D environment information and the 3D display parameters are the parameters used when displaying 3D environment information; a CPU operation information acquisition module, configured to acquire the operation information of the CPU, where the CPU includes a plurality of cores; and a core adjustment module, configured to perform the core adjustment, where the execution core is the CPU core running a target thread and the target thread is a thread for processing 2D environment information and/or 3D environment information.
  • An embodiment of the present invention further provides a storage medium, on which a computer program is stored, and when the computer program is run by a processor, the steps of any one of the methods are executed.
  • An embodiment of the present invention further provides an embedded device whose CPU is multi-core. The embedded device includes the driving image display device, or may include a memory and a processor, where the memory stores a computer program executable on the processor, and the processor executes the steps of any one of the methods when running the computer program.
  • An embodiment of the present invention further provides a driving image display platform, which includes a plurality of cameras for collecting environment information of a vehicle and acquiring captured images, an embedded device, and a vehicle-mounted display terminal for displaying the 2D environment information and/or the 3D environment information.
  • An embodiment of the present invention provides a driving image display method, comprising: acquiring 2D display parameters and 3D display parameters when displaying 2D environment information and 3D environment information synchronously, where the 2D display parameters are the parameters used when displaying 2D environment information and the 3D display parameters are the parameters used when displaying 3D environment information; acquiring operation information of the CPU, where the CPU includes multiple cores; and adjusting the frequency of the CPU and/or changing the execution core according to the 2D display parameters, the 3D display parameters and/or the operation information of the CPU, where the execution core is the CPU core running a target thread, and the target thread is a thread for processing 2D environment information and/or 3D environment information.
  • the present invention provides a solution for displaying 2D and 3D images at the same time.
  • the implementation of the 2D algorithm and the 3D algorithm can be monitored in real time.
  • the execution core is adjusted to ensure hardware support, thereby ensuring the frame rate and real-time performance of the 2D/3D display and solving the existing panoramic driving image system's problems of reduced display frame rate and display delay.
  • the captured image acquisition thread, the 2D algorithm processing thread and the 3D algorithm processing thread are divided into three independent threads, so as to avoid mutual interference between the threads and improve the processing efficiency.
  • for the 2D/3D algorithm processing threads, at least 5 pointers are defined over the Buffer.
  • the 2D and 3D algorithm processing threads, which run concurrently, can read the data in the Buffer at the same time through two sets of data read pointers (one set being the first pointer and the second pointer, the other being the third pointer and the fourth pointer).
  • a multi-pointer scheme is used on the Buffer data so that the Buffer data can be obtained concurrently under multi-threaded access, and pointer passing is used during data parsing, packaging and transmission, reducing the delay caused by actually copying the data. The acquisition of Camera data is decoupled from the 2D/3D algorithm processing, and multi-thread acceleration and zero-copy technology are used to reduce data and algorithm processing delays.
  • FIG. 1 is a schematic flowchart of a method for displaying driving images according to an embodiment of the present invention
  • FIG. 2 is a schematic diagram of a scheduling policy according to an embodiment of the present invention.
  • FIG. 3 is a schematic diagram of three threads provided by an embodiment of the present invention.
  • FIG. 4 is a schematic diagram of the execution steps of a specific embodiment of thread 1 in FIG. 3;
  • FIG. 5 is a schematic diagram of the pointing positions of five pointers defined in an embodiment of the present invention.
  • FIG. 6 is a schematic structural diagram of a driving image display device according to an embodiment of the present invention.
  • FIG. 7 is a schematic structural diagram of a driving image display platform according to an embodiment of the present invention.
  • the output frame rate of commonly used embedded video capture devices on the market is 30 fps, that is, each frame takes about 33 ms. If the 2D/3D display frame rate lags, the relationship between the delayed frame count and the delay time can be expressed by formula (1): T = n × 33 ms, where:
  • n is the delay frame rate
  • T is the total delay time
  • if the video capture device outputs images at 30 fps but the 2D/3D display outputs at 25 fps, this causes a delay of 5 frames per second.
  • using the above method, the distance offset caused by the delay is 0.46 m. It can be seen that the faster the vehicle speed, the greater the offset.
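  • The delay-to-offset arithmetic above can be sketched as follows. This is a hedged reconstruction: the excerpt omits formula (1) itself and the vehicle speed behind the 0.46 m figure, so the 33 ms frame interval reading and the 10 km/h speed in the example are assumptions.

```python
FRAME_INTERVAL_S = 1.0 / 30.0  # ~33 ms per frame at 30 fps capture

def delay_time(n_delayed_frames: int) -> float:
    """Total display delay T for n frames of lag (formula (1), read as T = n * 33 ms)."""
    return n_delayed_frames * FRAME_INTERVAL_S

def distance_offset(speed_kmh: float, n_delayed_frames: int) -> float:
    """Distance the vehicle travels during the display delay, in metres."""
    speed_ms = speed_kmh / 3.6  # km/h -> m/s
    return speed_ms * delay_time(n_delayed_frames)

# Capture at 30 fps, display at 25 fps -> 5 frames of lag per second.
# At an assumed 10 km/h this reproduces roughly the 0.46 m offset quoted above.
offset = distance_offset(10, 5)
```

As the text notes, the offset grows linearly with vehicle speed, which is why the delay directly endangers the driver's distance judgment.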
  • the reason for this problem is that embedded devices are constrained by performance and power consumption and cannot run at full power for long periods on the vehicle platform. Running unoptimized 2D/3D algorithms on the embedded platform therefore keeps the frame rate below 30 fps. For the 2D/3D display to meet the real-time frame rate requirement, the 2D/3D algorithms must complete within the 33 ms interval between frames.
  • an embodiment of the present invention provides a driving image display method, including: acquiring 2D display parameters and 3D display parameters when displaying 2D environment information and 3D environment information synchronously, where the 2D display parameters are the parameters used when displaying 2D environment information and the 3D display parameters are the parameters used when displaying 3D environment information; acquiring operation information of the CPU, where the CPU includes multiple cores; and adjusting an execution core according to the 2D display parameters, the 3D display parameters and/or the operation information of the CPU, where the execution core is a CPU core running a target thread, and the target thread is a thread for processing 2D environment information and/or 3D environment information.
  • This solution can solve the problems of reduced display frame rate and display delay when the panoramic driving image system displays 2D and 3D images at the same time, so as to improve driving safety.
  • FIG. 1 is a schematic flowchart of a driving image display method according to an embodiment of the present invention. The method includes the following steps:
  • Step S101: when 2D environment information and 3D environment information are displayed synchronously, acquire 2D display parameters and 3D display parameters, where the 2D display parameters are the parameters used when displaying the 2D environment information and the 3D display parameters are the parameters used when displaying the 3D environment information;
  • the 2D environment information is an image of the surrounding environment information of the vehicle displayed on a display device (such as a screen, etc.).
  • the 3D environment information is an image of the surrounding environment information of the vehicle displayed on the display device.
  • the 2D/3D display parameters are attribute parameters that reflect how the 2D/3D algorithm executes (for example, the operation of the thread executing the 2D/3D algorithm), or attribute parameters of the 2D/3D environment information to be displayed (for example, the display frame rate of the 2D/3D environment information).
  • the display frame rate, measured in frames per second (Frames Per Second, fps), is a term from the imaging field referring to the number of frames transmitted per second, typically of an animation or video.
  • fps is a measure of the amount of information used to save and display motion video. The more frames per second, the smoother the video motion displayed.
  • the 2D display parameters include a display frame rate for displaying 2D environment information
  • the 3D display parameters include a display frame rate for displaying 3D environment information.
  • Step S102: obtain operation information of the CPU, where the CPU includes multiple cores;
  • the central processing unit (Central Processing Unit, referred to as CPU), as the computing and control core of the computer system, is the final execution unit for information processing and program running.
  • the CPU may include a multi-core structure of a symmetric multiprocessor (Symmetric Multiprocessing, SMP for short) or a heterogeneous multiprocessor (Heterogeneous Multiprocessing, HMP for short).
  • HMP can realize the dynamic configuration of different types of CPU, Graphics Processing Unit (GPU) and other processing engines in the system to ensure the most appropriate task allocation, optimal performance and lowest power consumption in actual work.
  • the running information of the CPU is related information used to represent the state of the CPU during running, such as the occupancy (or occupancy rate) of the CPU, the frequency of the CPU, the temperature of the CPU, and the like.
  • the operating information of the CPU includes the temperature of the CPU and/or the frequency of the CPU.
  • Step S103: adjust the frequency of the CPU and/or change the execution core according to the 2D display parameters, the 3D display parameters and/or the running information of the CPU, where the execution core is the CPU core running the target thread, and the target thread is a thread for processing 2D environment information and/or 3D environment information.
  • the execution of the 2D algorithm and the 3D algorithm is monitored according to the 2D display parameters and the 3D display parameters, and the operation of the hardware device implementing the method is monitored through the operation information of the CPU.
  • if the current execution core or CPU cannot support the simultaneous real-time display of 2D and 3D environment information, the frequency of the CPU can be adjusted and/or the execution core can be changed.
  • the CPU core includes at least one large core and several small cores, and the computing power of the large core is stronger than that of the small core.
  • an existing multi-core CPU may include 2 large cores and 6 small cores, or 4 large cores and 4 small cores, etc. Adjusting the execution core may also include moving target threads from running on large cores to running on small cores, or from small cores to large cores.
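  • The big/little core selection rule described in this embodiment can be sketched as follows. The core IDs assume a hypothetical 6-little + 2-big layout, and the Linux affinity call is an illustrative way to apply the choice; neither is specified by the disclosure.

```python
import os

BIG_CORES = {6, 7}               # assumed big-core IDs (SoC-specific)
LITTLE_CORES = {0, 1, 2, 3, 4, 5}

def choose_cores(needs_max_freq: bool, temp_rising: bool, temp_falling: bool) -> set:
    """Pick the core set for the 2D/3D target threads.

    Mirrors the text's rule: if the frequency to set has hit the maximum and
    the CPU temperature rises over the first preset window, move to a big
    core; if the temperature falls over the second window, move back to a
    little core; otherwise leave the thread free to run anywhere.
    """
    if needs_max_freq and temp_rising:
        return BIG_CORES
    if temp_falling:
        return LITTLE_CORES
    return BIG_CORES | LITTLE_CORES

def pin_current_thread(cores: set) -> None:
    """Apply the choice via the Linux affinity syscall (no-op elsewhere)."""
    if hasattr(os, "sched_setaffinity"):
        os.sched_setaffinity(0, cores)
```

In practice the temperature trend would be sampled over the first and second preset times mentioned in the text before calling `choose_cores`.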
  • Step S103 sets a CPU scheduling strategy. Taking 8 CPU cores as an example, namely 2 large cores (cores 1 and 2 of the large cores in FIG. 2) and 6 small cores (cores 1 to 6 of the small cores in FIG. 2), the schematic diagram of the scheduling strategy is shown in FIG. 2.
  • the CPU frequency modulation module is shown in FIG. 2.
  • the scheduling strategy is essentially allocating computing resources of the CPU on demand.
  • the CPU frequency modulation operation is integrated into the scheduling strategy. Since the running information of the CPU and the 2D/3D display parameters change, the frequency of the CPU managed by the strategy needs to be updated in real time.
  • the CPU frequency modulation module can also be used to move the target thread to run on another CPU core. Since the scheduling policy manages multiple CPU cores, it can schedule computing resources to run on different cores by identifying the performance and power-consumption differences between CPU cores of different architectures.
  • the CPU frequency modulation module can apply dynamic voltage and frequency scaling (DVFS), that is, dynamically adjust the voltage and frequency to balance performance and power consumption.
  • the operating frequency and voltage of the chip are dynamically adjusted according to the different demands of the applications running on the chip on the computing power of the CPU. For the same chip, the higher the frequency, the higher the required voltage.
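  • On a Linux-based embedded platform, the per-core frequency that DVFS manages is commonly exposed through the standard cpufreq sysfs interface; the disclosure does not name this interface, so the following read-only sketch is only one plausible way to observe the current and maximum frequencies.

```python
from pathlib import Path
from typing import Optional

def read_khz(cpu: int, field: str) -> Optional[int]:
    """Read a cpufreq value (in kHz) from the Linux sysfs DVFS interface,
    e.g. field='scaling_cur_freq' or 'cpuinfo_max_freq'. Returns None when
    the interface is absent (non-Linux host or no cpufreq driver)."""
    path = Path(f"/sys/devices/system/cpu/cpu{cpu}/cpufreq/{field}")
    try:
        return int(path.read_text())
    except (OSError, ValueError):
        return None
```

Writing `scaling_max_freq` or changing the governor through the same directory (with root privileges) is how a user-space scheduling strategy would actually cap or raise the frequency.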
  • the size of the area where the 2D environment information is displayed is smaller than the size of the area where the 3D environment information is displayed.
  • the 2D environment information and the 3D environment information may be displayed on the same screen, or may be displayed on different screens.
  • the size of the area displaying 2D environment information accounts for one third of the total size of the screen
  • the size of the area displaying 3D environment information accounts for two thirds of the total size of the screen.
  • the embodiment of FIG. 1 provides a solution for displaying 2D and 3D images at the same time.
  • the execution of the 2D and 3D algorithms and the state of the CPU can be monitored in real time.
  • the execution core is adjusted to ensure hardware support, thereby ensuring the frame rate and real-time performance of the 2D/3D display and solving the existing panoramic driving image system's problems of reduced display frame rate and display delay during simultaneous real-time display.
  • the CPU core of the hardware device for executing the method includes at least one large core and several small cores
  • the operation of adjusting the execution core in step S103 may be: if the CPU frequency value to be set needs to be adjusted to the maximum frequency value, and the temperature of the CPU increases within a first preset time, the execution core is changed to a large core.
  • the frequency value of the CPU that needs to be set is a frequency value that can meet the 2D/3D display requirements.
  • the maximum frequency is a set CPU threshold. When the CPU frequency reaches this threshold, the strategy switches between large and small cores based on the temperature of the CPU, using the large core with stronger computing power as the execution core to ensure synchronous display.
  • the method further includes: if the temperature of the CPU drops within a second preset time, changing the execution core to a small core.
  • the first preset time may be the same as or different from the second preset time, and the two periods of time are the time for monitoring the operation of the CPU, which can be set as required.
  • the linear CPU frequency adjustment and the large/small core switching judgment in the scheduling policy follow formula (3) and formula (4), in which:
  • freq_new is the CPU frequency to be set
  • freq_max is the maximum frequency
  • K is the excess factor
  • load is the current load of the CPU
  • load_max is the maximum load of the CPU
  • P is the compensation factor
  • CPU_temp is the CPU temperature
  • fps_2D/3D is the 2D/3D display frame rate.
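  • Formulas (3) and (4) themselves are not reproduced in this excerpt, so the following is only a hypothetical linear frequency rule assembled from the listed variables: scale the frequency with the load ratio, apply an excess factor K and a compensation term P, and clamp at freq_max.

```python
def next_cpu_freq(load: float, load_max: float, freq_max: float,
                  k: float = 1.0, p: float = 0.0) -> float:
    """Hypothetical stand-in for the linear adjustment of formula (3):
    frequency grows linearly with load (scaled by the excess factor k),
    is shifted by the compensation term p, and never exceeds freq_max."""
    freq = freq_max * (load / load_max) * k + p
    return min(freq, freq_max)
```

In the patent's scheme, p would be derived from CPU_temp and fps_2D/3D; reaching the freq_max clamp is exactly the condition that triggers the large/small core switching judgment of formula (4).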
  • the method further includes: controlling a CPU core not executing a task to enter a light sleep mode.
  • a CPU idle (Idle) module (shown in FIG. 2) is used to control CPU cores that are not executing tasks to enter sleep mode.
  • the sleep modes of a CPU core may include a deep sleep mode and a light sleep mode.
  • although the deep sleep mode has the lowest CPU core power consumption, its wake-up delay is significantly longer. Therefore, to reduce wake-up delay, the scheduling strategy only allows CPU cores to enter the light sleep mode and prohibits them from entering the deep sleep mode.
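  • On Linux, prohibiting deep sleep while keeping light sleep can be approximated by disabling the deeper cpuidle states through sysfs. This is a hedged sketch, not the disclosure's mechanism: it requires root, and the idle-state numbering is SoC-specific (the IDs below are assumptions).

```python
from pathlib import Path

def forbid_deep_sleep(cpu: int, deep_state_ids=(2, 3)) -> list:
    """Disable the assumed deep cpuidle states for one core via the Linux
    cpuidle sysfs interface, leaving shallower (light) states enabled.
    Returns the paths actually written; silently skips states that are
    absent or not writable (e.g. when not running as root)."""
    written = []
    for state in deep_state_ids:
        path = Path(f"/sys/devices/system/cpu/cpu{cpu}/cpuidle/state{state}/disable")
        try:
            path.write_text("1")   # 1 = disable this idle state
            written.append(str(path))
        except OSError:
            pass
    return written
```

A scheduling strategy would apply this to every idle-capable core so that any core entering idle stays in a light, quickly wakeable state.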
  • before adjusting the execution core according to the 2D display parameters, the 3D display parameters and/or the running information of the CPU, the method further includes: acquiring the 2D display parameters and the 3D display parameters from the application layer (the Application layer in FIG. 2); the 2D/3D display parameters may be the display frame rates of the 2D/3D environment information (abbreviated as display frame rate in FIG. 2).
  • the temperature of the CPU and the frequency of the CPU are obtained from the CPU kernel (Kernel).
  • the target thread in FIG. 1 includes at least the three threads in FIG. 3: a captured image acquisition thread (thread 1 in FIG. 3), a 2D algorithm processing thread (thread 2 in FIG. 3), and a 3D algorithm processing thread (thread 3 in FIG. 3).
  • the captured image acquisition thread is used to acquire data captured by the cameras from one or more cameras (Cameras) capturing the surrounding environment of the vehicle, and perform data analysis and data packaging (shown in FIG. 3 ).
  • the plurality of cameras include cameras that capture four directions of front, rear, left, and right of the vehicle.
  • data parsing decomposes the data of multiple cameras as required. If the cameras are installed in the four directions of the front, rear, left and right of the vehicle, the acquired data set needs to be parsed into four independent data groups: front, rear, left and right. Further, since the data collected by the Camera is in YUV format, each of the four independent data groups can be further parsed into Y-component and UV-component data groups. Data encapsulation expresses the separated data by means of a buffer (Buffer).
  • the 2D algorithm processing thread and the 3D algorithm processing thread read the data stored in the Buffer at the same time.
  • the 2D algorithm processing thread is used to process the acquired data to obtain displayable 2D environment information and transmit it for display by the display device.
  • the 3D algorithm processing thread is used to process the acquired data to obtain displayable 3D environment information and transmit it for display by the display device.
  • the captured image acquisition thread, the 2D algorithm processing thread, and the 3D algorithm processing thread are divided into three independent threads, so as to avoid mutual interference between the threads and improve processing efficiency.
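  • The three-thread split can be sketched with standard queues: one capture thread fanning frames out to two independent workers, so the 2D and 3D paths never block each other. The processing callables here are trivial stand-ins for the unspecified 2D/3D algorithms.

```python
import queue
import threading

q2d, q3d = queue.Queue(), queue.Queue()

def capture_thread(frames):
    """Thread 1: acquire frames, then hand each to both consumers."""
    for f in frames:
        q2d.put(f)
        q3d.put(f)
    q2d.put(None)   # sentinels: signal end of stream
    q3d.put(None)

def worker(q, process, out):
    """Threads 2 and 3: drain their own queue independently."""
    while (f := q.get()) is not None:
        out.append(process(f))

out2d, out3d = [], []
threads = [
    threading.Thread(target=capture_thread, args=([b"frame0", b"frame1"],)),
    threading.Thread(target=worker, args=(q2d, lambda f: ("2d", f), out2d)),
    threading.Thread(target=worker, args=(q3d, lambda f: ("3d", f), out3d)),
]
for t in threads:
    t.start()
for t in threads:
    t.join()
```

A slow 3D algorithm only backs up its own queue; the capture thread and the 2D worker keep running, which is the interference-avoidance the text describes.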
  • the execution steps of thread 1 in FIG. 3 include:
  • Step S401: acquire captured images from several cameras of the vehicle through the captured image acquisition thread, and store the acquired images in a buffer area, where the captured images include 2D images and 3D images;
  • Step S402 intercepting at least one frame of 2D image from the buffer area each time and sending it to the 2D algorithm processing thread, and the 2D algorithm processing thread processes the 2D image to obtain displayable 2D environment information;
  • Step S403 at least one frame of 3D image is intercepted from the buffer area each time and sent to the 3D algorithm processing thread, and the 3D algorithm processing thread processes the 3D image to obtain displayable 3D environment information.
  • the execution order of step S402 and step S403 is not limited; they may also be performed simultaneously.
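Steps S401 to S403 running as three independent threads can be sketched with one producer and two consumers; the queue-based hand-off and the toy frame payload are assumptions for illustration (the patent's buffer uses pointer pairs rather than queues).

```python
import queue
import threading

# Sketch: thread 1 acquires frames into buffers; independent 2D and 3D
# threads read and process them, so the three stages do not block each other.
buf_2d = queue.Queue()
buf_3d = queue.Queue()
results = {"2D": [], "3D": []}

def acquire(n_frames):                # thread 1: captured-image acquisition
    for i in range(n_frames):
        frame = bytes([i]) * 4        # stand-in for one captured frame
        buf_2d.put(frame)             # the same data is offered to both
        buf_3d.put(frame)
    buf_2d.put(None)                  # sentinel: acquisition finished
    buf_3d.put(None)

def process(buf, key):                # threads 2 and 3: 2D/3D algorithms
    while (frame := buf.get()) is not None:
        results[key].append(len(frame))   # placeholder for the real algorithm

threads = [threading.Thread(target=acquire, args=(3,)),
           threading.Thread(target=process, args=(buf_2d, "2D")),
           threading.Thread(target=process, args=(buf_3d, "3D"))]
for t in threads:
    t.start()
for t in threads:
    t.join()
```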
  • the method further includes: defining at least 5 pointers, including a first pointer, a second pointer, a third pointer, a fourth pointer and a fifth pointer, whose pointed-to positions are shown in FIG. 5. In step S402, each time at least one frame of 2D image is intercepted from the buffer area, the first pointer and the second pointer point, respectively, to the start position of this interception (start position 1 in FIG. 5) and its stop position (stop position 1 in FIG. 5). In step S403, each time at least one frame of 3D image is intercepted from the buffer area, the third pointer and the fourth pointer point, respectively, to the start position of this interception (start position 2 in FIG. 5) and its stop position (stop position 2 in FIG. 5).
  • the fifth pointer points to the maximum capacity position of the buffer area (marked in FIG. 5); the positions pointed to by the first, second, third and fourth pointers cannot exceed the position pointed to by the fifth pointer.
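A minimal sketch of this five-pointer bookkeeping, assuming a byte-addressed buffer and arbitrary capacity and slice values:

```python
# Sketch: two independent (start, stop) read-pointer pairs over one shared
# buffer, bounded by a fifth maximum-capacity pointer.
class BufferPointers:
    def __init__(self, capacity):
        self.p5 = capacity            # fifth pointer: maximum-capacity position
        self.p1 = self.p2 = 0         # first/second: 2D start and stop
        self.p3 = self.p4 = 0         # third/fourth: 3D start and stop

    def intercept_2d(self, start, stop):
        if not 0 <= start <= stop <= self.p5:
            raise ValueError("2D pointers must not exceed the fifth pointer")
        self.p1, self.p2 = start, stop

    def intercept_3d(self, start, stop):
        if not 0 <= start <= stop <= self.p5:
            raise ValueError("3D pointers must not exceed the fifth pointer")
        self.p3, self.p4 = start, stop

ptrs = BufferPointers(capacity=1024)
ptrs.intercept_2d(0, 256)             # 2D thread marks bytes [0, 256)
ptrs.intercept_3d(0, 512)             # 3D thread independently marks [0, 512)
```

Because each reader owns its own pointer pair, neither interception disturbs the other, which is what allows the two algorithm threads to read concurrently.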
  • the 2D and 3D algorithm processing threads that run at the same time can therefore read the data in the Buffer simultaneously through two sets of data read pointers (one set being the first pointer and the second pointer, the other the third pointer and the fourth pointer).
  • Zero-Copy technology is used in at least one of the following operations: acquiring a captured image, intercepting a 2D image, and intercepting a 3D image.
  • in step S402 and step S403, each time at least one frame of 2D/3D image is intercepted from the buffer area and sent to the 2D/3D algorithm processing thread, the pointers of the intercepted 2D/3D image (the first pointer and the second pointer, or the third pointer and the fourth pointer) are passed to the 2D/3D processing algorithm; passing the pointed-to data values by pointer reduces the delay caused by actually copying large amounts of data, realizing the zero-copy technique.
  • the parsed data groups are expressed with the Buffer method described in the zero-copy technique, and the encapsulated data is then passed to thread 2 and thread 3, which run the 2D and 3D algorithm processing, respectively.
  • data transmission between threads can use the zero-copy technique, and multi-thread acceleration is used to decouple captured-image acquisition from the running of the 2D/3D processing algorithms.
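In Python, the pointer-transfer idea can be approximated with `memoryview` slices, which reference the underlying storage without copying it; this is only an analogy to the C-level pointer passing, not the patent's implementation.

```python
# Sketch: pass a view (pointer-like reference) instead of copying the data.
camera_buffer = bytearray(16)         # stand-in for the shared Camera Buffer
view = memoryview(camera_buffer)

frame_2d = view[0:8]                  # "intercept" a frame: no bytes copied
frame_3d = view[0:16]

camera_buffer[0] = 0xFF               # new data lands in the shared buffer
```

Both views immediately observe the new byte, confirming that only references, not data, were transferred.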
  • in the existing method of storing Camera data (that is, the data corresponding to the captured images) into a Buffer, there is only one set of data read pointers, so if multiple threads read the data in the Buffer at the same time, concurrent processing cannot be performed.
  • in this real-time display method for panoramic driving images based on embedded devices, it should be noted that a zero-copy technique implemented with multiple data read pointers on the basis of the content of the present invention does not affect the essential content of the present invention, and likewise falls within the protection scope of the present invention.
  • the zero-copy technique is used together with the multi-pointer method on the Buffer data, so that multi-threaded accesses can obtain the data of the Camera Buffer concurrently, and pointer copies are used in the process of data parsing, encapsulation and transfer, reducing the delay caused by actually copying the data. The acquisition of Camera data is decoupled from the processing of the 2D/3D algorithms, and multi-thread acceleration and the zero-copy technique are used to reduce data and algorithm processing delays.
  • the method further includes: receiving control information sent by the vehicle-mounted central control and/or the gear controller; and synchronously displaying the 2D environment information and the 3D environment information according to the control information.
  • the control information is sent by the vehicle-mounted central control and/or the gear controller; the control buttons of the central control, or the reverse/forward gears, are used to start and stop the 2D/3D display and to switch the UI interface between displaying 2D and 3D environment information. For example, simultaneous 2D and 3D display is enabled in reverse gear, while only the 2D display is enabled when the vehicle is moving forward normally.
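The gear-driven switching can be sketched as a simple mapping; the gear names and mode labels are assumed for illustration.

```python
# Sketch: choose which displays to enable from the gear controller's signal.
def display_modes(gear):
    if gear == "reverse":             # reverse gear: 2D and 3D together
        return {"2D", "3D"}
    return {"2D"}                     # normal forward driving: 2D only
```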
  • an embodiment of the present invention further provides a driving image display device 60, including:
  • the parameter acquisition module 601 is used to acquire 2D display parameters and 3D display parameters when 2D environment information and 3D environment information are displayed synchronously, where the 2D display parameters are parameters used when displaying 2D environment information, and the 3D display parameters are parameters used when displaying 3D environment information;
  • a CPU operation information acquisition module 602 configured to acquire operation information of a CPU, where the CPU includes multiple cores;
  • an execution core adjustment module 603, configured to adjust the frequency of the CPU and/or change the execution core according to the 2D display parameters, the 3D display parameters and/or the running information of the CPU, where the execution core is the CPU core running a target thread, and the target thread is a thread for processing 2D environment information and/or 3D environment information.
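The big/little core switching performed by the execution core adjustment module (described in claims 4 and 5) can be sketched as follows; the temperature-trend checks are reduced to booleans and all thresholds are assumptions.

```python
# Sketch: escalate the target thread to a big core when the required CPU
# frequency has reached the maximum and the temperature keeps rising;
# de-escalate back to a little core once the temperature falls.
def adjust_execution_core(core, freq_new, freq_max, temp_rising, temp_falling):
    if core == "little" and freq_new >= freq_max and temp_rising:
        return "big"                  # cf. claim 4
    if core == "big" and temp_falling:
        return "little"               # cf. claim 5
    return core
```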
  • the above-mentioned driving image display device 60 may correspond to a chip with a display function in an embedded device, or to a chip with a data processing function, such as a system-on-a-chip (System-On-a-Chip, SOC for short) or a baseband chip; or to a chip module that includes a display-function chip in an embedded device; or to a chip module having a data-processing-function chip; or to an embedded device.
  • each module/unit included in each device and product described in the above embodiments may be a software module/unit, a hardware module/unit, or partly a software module/unit and partly a hardware module/unit.
  • for each device and product applied to or integrated in a chip, the modules/units it contains may all be implemented by hardware such as circuits, or at least some of the modules/units may be implemented by a software program running on the processor integrated inside the chip, with the remaining (if any) modules/units implemented by hardware such as circuits. For each device and product applied to or integrated in a chip module, the modules/units it contains may all be implemented by hardware such as circuits, and different modules/units may be located in the same component of the chip module (such as a chip or a circuit module) or in different components; or at least some of the modules/units may be implemented by a software program running on the processor integrated inside the chip module, with the remaining (if any) modules/units implemented by hardware such as circuits. For each device and product applied to or integrated in a terminal, the modules/units it contains may all be implemented by hardware such as circuits, and different modules/units may be located in the same component (such as a chip or a circuit module) or in different components within the terminal; or at least some of the modules/units may be implemented by a software program running on the processor integrated inside the terminal, with the remaining (if any) modules/units implemented by hardware such as circuits.
  • An embodiment of the present invention further provides a storage medium on which a computer program is stored, and when the computer program is run by a processor, the steps of the methods described in FIG. 1 to FIG. 5 are executed.
  • the storage medium may be a computer-readable storage medium, for example, may include non-volatile memory (non-volatile) or non-transitory (non-transitory) memory, and may also include optical disks, mechanical hard disks, solid-state disks, and the like.
  • the embodiment of the present invention also provides an embedded device.
  • the CPU of the embedded device is multi-core, and the embedded device includes the driving image display device 60 shown in FIG. 6; alternatively, the embedded device may include a memory and a processor, where the memory stores a computer program that can run on the processor, and when the processor runs the computer program, the steps of the methods described in FIGS. 1 to 5 are executed.
  • an embodiment of the present invention further provides a driving image display platform 70, which includes a plurality of cameras 701 for collecting environment information of the vehicle and acquiring captured images, an embedded device 703, and a vehicle-mounted display terminal 704 for displaying 2D environment information and/or 3D environment information.
  • the 2D environment information is displayed in the 2D display area of the vehicle-mounted display terminal 704, and the 3D environment information is displayed in the 3D display area of the vehicle-mounted display terminal 704.
  • the size of the 2D display area is smaller than the size of the 3D display area.
  • 2D and 3D displays are based on different surfaces created by the operating system.
  • because the 2D and 3D displays are based on different Surfaces, their display frame rates in the different Surfaces will differ. With the driving image display method provided by the embodiment of the present invention, the 2D and 3D algorithms are processed within the output-frame interval of the image acquisition device such as a camera (a commonly used embedded Camera acquisition device outputs 30 fps, an interval of about 33 ms), so that the 2D/3D display frame rate can be synchronized with the output frame rate of the Camera acquisition device.
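The 30 fps budget works out as follows; the sample processing times are assumed figures.

```python
# Sketch: the 2D/3D algorithms must finish within one camera frame interval
# for the display frame rate to stay synchronized with the Camera output.
CAMERA_FPS = 30
FRAME_INTERVAL_MS = 1000 / CAMERA_FPS     # about 33.3 ms between frames

def stays_synchronized(processing_ms):
    return processing_ms <= FRAME_INTERVAL_MS
```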
  • the driving image display platform 70 is powered by the vehicle power supply 702 .
  • the embedded device can also be powered by the vehicle power supply 702 alone, and the stable power supply of the vehicle power supply 702 also determines the stability of the CPU operation.
  • the above-mentioned methods for displaying panoramic driving images in real time as shown in FIGS. 1 to 5 can be executed in the CPU of an embedded device or of a driving image display platform of an automobile, and the CPU includes a multi-core structure of SMP or HMP.
  • the hardware of the embedded device or the driving image display platform 70 may include a CPU based on the X86, ARM, MIPS or PowerPC architecture, and the software system executing the above method may include an operating system based on Android, iOS, Windows or Linux. It should be noted that a real-time display method for panoramic driving images based on embedded devices that is implemented with other hardware and software systems on the basis of the content of the present invention does not affect the essential content of the present invention, and is likewise regarded as falling within the protection scope of the present invention.
  • the processor may be a central processing unit (central processing unit, CPU for short), or another general-purpose processor, a digital signal processor (digital signal processor, DSP for short), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, etc.
  • a general purpose processor may be a microprocessor or the processor may be any conventional processor or the like.
  • the memory in the embodiments of the present application may be volatile memory or non-volatile memory, or may include both volatile and non-volatile memory.
  • the non-volatile memory may be read-only memory (ROM for short), programmable read-only memory (PROM for short), erasable programmable read-only memory (EPROM for short) , Electrically Erasable Programmable Read-Only Memory (electrically EPROM, EEPROM for short) or flash memory.
  • Volatile memory may be random access memory (RAM), which acts as an external cache.
  • by way of example and not limitation, many forms of random access memory (RAM) are available, such as static RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), synchlink DRAM (SLDRAM) and direct rambus RAM (DR RAM).
  • in summary, the present invention realizes a driving image display method, device, platform, storage medium and embedded device which, when an image acquisition device such as a camera outputs at 30 fps, process the 2D/3D algorithms within the 33 ms interval between frames, thereby ensuring the frame rate and real-time performance of the 2D/3D display.
  • the embodiment of the present invention specifically includes the following effects:
  • the 2D/3D display frame rate can be maintained in sync with the output frame rate of the Camera capture device.
  • if captured-image acquisition and the 2D/3D algorithms were coupled serially, the number of delayed frames would increase. Therefore, in this method the acquisition of Camera data is decoupled from the processing of the 2D/3D algorithms, and multi-thread acceleration and the zero-copy technique are used to reduce data and algorithm processing delays.
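The delay and distance-offset formulas from the description (T = 0.033 × n and S = T × 2.8, at roughly 10 km/h, since 10 km/h is about 2.78 m/s) can be checked numerically:

```python
# Sketch: distance offset caused by n delayed display frames at ~10 km/h,
# using T = 0.033 * n (seconds) and S = T * 2.8 (meters) from the text.
def delay_seconds(n_frames):
    return 0.033 * n_frames           # one frame lasts ~33 ms at 30 fps

def offset_meters(n_frames):
    return delay_seconds(n_frames) * 2.8   # ~2.8 m/s is roughly 10 km/h
```

A 5-frame lag (30 fps captured, 25 fps displayed) yields the 0.46 m offset cited in the description.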
  • connection in the embodiments of the present application refers to various connection modes such as direct connection or indirect connection, so as to realize communication between devices, which is not limited in the embodiments of the present application.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Software Systems (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Computer Security & Cryptography (AREA)
  • Controls And Circuits For Display Device (AREA)

Abstract

A driving image display method, device, platform, storage medium, and embedded device. The method includes: when synchronously displaying 2D environment information and 3D environment information, acquiring 2D display parameters and 3D display parameters, where the 2D display parameters are parameters used when displaying 2D environment information and the 3D display parameters are parameters used when displaying 3D environment information; acquiring running information of a CPU, the CPU comprising multiple cores; and adjusting the frequency of the CPU and/or changing the execution core according to the 2D display parameters, the 3D display parameters and/or the running information of the CPU, where the execution core is the CPU core running a target thread, and the target thread is a thread for processing 2D environment information and/or 3D environment information. This embodiment solves the problems of reduced display frame rate and display delay when an existing panoramic driving image system displays 2D and 3D pictures simultaneously.

Description

行车影像显示方法、装置及平台、存储介质、嵌入式设备
本申请要求2020年11月30日提交中国专利局、申请号为202011373475.7、发明名称为“行车影像显示方法、装置及平台、存储介质、嵌入式设备”的中国专利申请的优先权,其全部内容通过引用结合在本申请中。
技术领域
本发明涉及视频显示技术领域,尤其涉及一种行车影像显示方法、装置及平台、存储介质、嵌入式设备。
背景技术
随着图像、视觉技术的快速发展,越来越多的相关技术被应用到车载电子领域,传统行车影像***利用安装在车尾的单路摄像头,仅能覆盖车尾周围视角有限的区域,无法环视车辆周围的信息,这将大大增加驾驶员的行车安全隐患。
目前新型的全景行车影像***使用多路摄像头感知车辆周围环境信息,通过360度(或记作“360°”)的平面二维(2D)模式或360°的三维(3D)立体模式得到车辆行驶信息。市场上常见的相机(Camera)采集设备的输出帧率为30帧每秒(Frames Per Second,简称fps),帧间隔约为33毫秒(ms),输出格式为YUV。其中,YUV,是一种颜色编码方法,“Y”表示明亮度(Luminance或Luma),也就是灰阶值,“U”和“V”表示的则是色度。然而,目前新型的全景行车影像***只能通过360°的平面2D模式,无法感知车辆360°的3D环境信息,或者是只能显示360°的3D模式,无法与2D平面模式兼容。
如果要同时显示2D和3D画面,嵌入式设备受到性能和功耗的影响,无法在车机平台上实现全功率、长时间运行,所以在嵌入式平 台上无优化的直接运行2D/3D算法会造成帧率达不到30fps的情况。此时,同时显示2D和3D画面则会造成2D/3D的显示帧率降低,造成显示延迟,显示的实时性无法得到满足。在驾驶员处理车辆起步、行车转弯、泊车入位、窄道会车、规避障碍等情况时,延迟可能造成的距离偏移,继而会给驾驶员的判断造成影响,危害行车安全。
发明内容
本发明解决的技术问题是如何解决现有的全景行车影像***同时显示2D和3D画面时的显示帧率降低、显示延迟的问题。
为解决上述问题,本发明实施例提供了一种行车影像显示方法,包括:在同步显示2D环境信息和3D环境信息时,获取2D显示参数和3D显示参数,其中,所述2D显示参数为显示2D环境信息时的参数,所述3D显示参数为显示3D环境信息时的参数;获取CPU的运行信息,所述CPU包含多个核;根据所述2D显示参数、所述3D显示参数和/或所述CPU的运行信息调整所述CPU的频率和/或变更执行核,所述执行核为运行目标线程的CPU核,所述目标线程为用于处理2D环境信息和/或3D环境信息的线程。
可选的,所述CPU的运行信息包括所述CPU的温度和/或所述CPU的频率。
可选的,所述2D显示参数包括显示2D环境信息的显示帧率,所述3D显示参数包括显示3D环境信息的显示帧率。
可选的,所述CPU核包括至少一个大核和若干个小核,调整执行核包括:如果需要设定的CPU频率值调整到频率最大值、且所述CPU的温度在第一预设时间内增加,将所述执行核变更为大核。
可选的,所述将所述执行核变更为大核之后,还包括:如果所述CPU的温度在第二预设时间内下降,将所述执行核变更为小核。
可选的,根据下述公式调整需要设定的CPU频率值:
Figure PCTCN2021127903-appb-000001
所述P的取值满足公式:
Figure PCTCN2021127903-appb-000002
其中,freq new为需要设定的CPU频率,freq max为所述频率最大值,K为过度因子,load为CPU的当前负载,load max为CPU的最大负载,P为补偿因子,CPU temp为CPU的温度,fps 2D/3D为2D/3D的显示帧率。
可选的,所述方法还包括:控制无执行任务的CPU核进入浅睡眠模式。
可选的,所述根据所述2D显示参数、所述3D显示参数和/或所述CPU的运行信息调整执行核之前,还包括:从应用层获取2D显示参数和3D显示参数;从CPU内核中获取所述CPU的温度和所述CPU的频率。
可选的,所述目标线程至少包括:拍摄图像获取线程、2D算法处理线程和3D算法处理线程。
可选的,所述方法还包括:通过所述拍摄图像获取从车辆的若干个相机获取拍摄图像,并将获取的拍摄图像存储到缓存区,所述拍摄图像包括2D图像和3D图像;每次从所述缓存区截取至少一帧2D图像发送至所述2D算法处理线程,并由所述2D算法处理线程对所述2D图像进行处理,以得到可显示的2D环境信息;每次从所述缓存区截取至少一帧3D图像发送至所述3D算法处理线程,并由所述3D算法处理线程对所述3D图像进行处理,以得到可显示的3D环境信息。
可选的,所述方法还包括:定义至少5个指针,所述指针包括第一指针、第二指针、第三指针、第四指针和第五指针;在每次从所述缓存区截取至少一帧2D图像时,第一指针和第二指针分别指向本次 截取的开始位置和停止位置;在每次从所述缓存区截取至少一帧3D图像时,第三指针和第四指针分别指向本次截取的开始位置和停止位置;所述第五指针指向所述缓存区的最大容量位置,所述第一指针、第二指针、第三指针和第四指针指向的位置均不能超出第五指针指向的位置。
可选的,至少在下述其中一个操作中使用Zero-Copy技术:获取拍摄图像、截取2D图像、截取3D图像。
可选的,所述方法还包括:接收车载中控和/或档位控制器发出的控制信息;根据所述控制信号同步显示2D环境信息和3D环境信息。
可选的,同步显示2D环境信息和3D环境信息时,显示2D环境信息的区域尺寸小于显示3D环境信息的区域尺寸。
本发明实施例还提供一种行车影像显示装置,所述装置包括:参数获取模块,用于在同步显示2D环境信息和3D环境信息时,获取2D显示参数和3D显示参数,其中,所述2D显示参数为显示2D环境信息时的参数,所述3D显示参数为显示3D环境信息时的参数;CPU运行信息获取模块,用于获取CPU的运行信息,所述CPU包含多个核;执行核调整模块,用于根据所述2D显示参数、所述3D显示参数和/或所述CPU的运行信息调整所述CPU的频率和/或变更执行核,所述执行核为运行目标线程的CPU核,所述目标线程为用于处理2D环境信息和/或3D环境信息的线程。
本发明实施例还提供一种存储介质,其上存储有计算机程序,所述计算机程序被处理器运行时执行任一项所述方法的步骤。
本发明实施例还提供一种嵌入式设备,所述嵌入式设备的CPU为多核,所述嵌入式设备包括所述行车影像显示装置,或者,所述嵌入式设备可以包括存储器和处理器,所述存储器上存储有可在所述处理器上运行的计算机程序,所述处理器运行所述计算机程序时执行任 一项所述方法的步骤。
本发明实施例还提供一种行车影像显示平台,所述平台包括若干个采集车辆的环境信息并获取拍摄图像的相机、嵌入式设备和用于显示2D环境信息和/或3D环境信息的车载显示终端。
与现有技术相比,本发明实施例的技术方案具有以下有益效果:
本发明实施例提供了一种行车影像显示方法,包括:在同步显示2D环境信息和3D环境信息时,获取2D显示参数和3D显示参数,其中,所述2D显示参数为显示2D环境信息时的参数,所述3D显示参数为显示3D环境信息时的参数;获取CPU的运行信息,所述CPU包含多个核;根据所述2D显示参数、所述3D显示参数和/或所述CPU的运行信息调整所述CPU的频率和/或变更执行核,所述执行核为运行目标线程的CPU核,所述目标线程为用于处理2D环境信息和/或3D环境信息的线程。较之现有技术,本发明提供了同时显示2D和3D画面的方案,为避免嵌入式设备上直接运行2D/3D算法造成显示帧率降低的情况,可实时监控2D算法和3D算法的执行情况和CPU的运行信息,在无法支持2D和3D环境信息的同步实时显示时,调整执行核,以保证硬件支持,从而保证2D/3D显示的帧率和实时性,解决了现有的全景行车影像***同时显示2D和3D画面时的显示帧率降低、显示延迟的问题。
进一步地,在调度策略中只允许CPU处于浅睡眠模式,禁止进入深度睡眠模式,以减少唤醒延迟。
进一步地,将拍摄图像获取线程、2D算法处理线程和3D算法处理线程分为三个独立的线程,避免各个线程相互干扰,提高处理效率。
进一步地,针对2D/3D算法处理线程,Buffer方法中定义至少5个指针。能够同时运行的2D/3D算法处理线程就能够通过两组数据读取指针(一组为第一指针和第二指针,另一组为第三指针和第四指 针)同时读取Buffer中的数据。
进一步地,同时使用zero-copy技术,在Buffer数据中使用多指针指向方法,使得多线程访问中能够并发的获得Buffer的数据,并在数据解析、封装和传递的过程中使用指针拷贝传递,减少数据实拷贝造成的延迟。并将获取Camera数据和处理2D/3D算法进行解耦,使用多线程加速和zero-copy技术来减少数据和算法处理延迟。
附图说明
图1为本发明实施例的一种行车影像显示方法的流程示意图;
图2为本发明实施例的一种调度策略的示意图;
图3为本发明实施例提供的3个线程的示意图;
图4为图3中线程1的一个具体实施例的执行步骤示意图;
图5为本发明实施例的定义的5个指针的指向位置的示意图;
图6为本发明实施例的一种行车影像显示装置的结构示意图;
图7为本发明实施例的一种行车影像显示平台的结构示意图。
具体实施方式
如背景技术所言,现有技术中存在全景行车影像***同时显示2D和3D画面时的显示帧率降低、显示延迟的问题。
具体地,目前市场上常用的嵌入式视频采集设备输出帧率为30fps,每fps需要33ms时间,如果2D/3D显示fps存在延迟,则延迟帧率与时间的关系为可以采用公式(1)表示:
T=0.033×n   (1);
其中,n为延迟帧率,T为延迟总时间。
如果驾驶员处理车辆起步、行车转弯、泊车入位、窄道会车、规避障碍等情况时,以10千米每小时(km/h)的速度运行,则由于延 迟造成的距离偏移可以用公式(2)表示:
S=T×2.8,单位:米(m)   (2);
其中,S为延迟偏移距离。
如果视频采集设备以30fps的速度输出图像,但是2D/3D显示输出帧率为25fps,这就造成了5fps的延迟,使用上述方法延迟的距离偏移为:0.46m。可以看出,速度越快造成的偏移越大。该问题的原因是嵌入式设备受到性能和功耗的影响,无法在车机平台上实现全功率、长时间运行,所以在嵌入式平台上无优化的直接运行2D/3D算法会造成帧率达不到30fps的情况。如果需要2D/3D满足实时性的帧率要求,需要在每帧之间的间隔时间33ms内处理完2D/3D算法。
基于上述技术问题,本发明实施例提供了一种行车影像显示方法,包括:在同步显示2D环境信息和3D环境信息时,获取2D显示参数和3D显示参数,其中,所述2D显示参数为显示2D环境信息时的参数,所述3D显示参数为显示3D环境信息时的参数;获取CPU的运行信息,所述CPU包含多个核;根据所述2D显示参数、所述3D显示参数和/或所述CPU的运行信息调整执行核,所述执行核为运行目标线程的CPU核,所述目标线程为用于处理2D环境信息和/或3D环境信息的线程。
通过该方案,能够解决全景行车影像***同时显示2D和3D画面时的显示帧率降低、显示延迟的问题,以提高行车安全。
为使本发明的上述目的、特征和有益效果能够更为明显易懂,下面结合附图对本发明的具体实施例做详细的说明。
请参见图1,图1为本发明实施例的一种行车影像显示方法,该方法包括以下步骤:
步骤S101,在同步显示2D环境信息和3D环境信息时,获取2D显示参数和3D显示参数,其中,所述2D显示参数为显示2D环境信息时的参数,所述3D显示参数为显示3D环境信息时的参数;
其中,2D环境信息为在显示设备(如屏幕等)上显示的车辆周围环境信息的图像。3D环境信息为在显示设备上显示的车辆周围环境信息的图像。
2D/3D显示参数为执行2D/3D算法时、用于体现算法执行情况的属性参数,例如,执行2D/3D算法的线程的运算情况等;或者输出要显示的2D/3D环境信息的属性参数,如2D/3D环境信息的显示帧率等。其中,显示帧率为每秒传输帧数(Frames Per Second,简称fps),是图像领域中的定义,是指画面每秒传输帧数,通俗来讲就是指动画或视频的画面数。fps是测量用于保存、显示动态视频的信息数量。每秒钟帧数越多,所显示的视频动作就会越流畅。
在一个具体实施例中,所述2D显示参数包括显示2D环境信息的显示帧率,所述3D显示参数包括显示3D环境信息的显示帧率。由此,能够通过2D/3D环境信息的显示帧率以实时监控当前的显示是否可能发生显示延迟。
步骤S102,获取CPU的运行信息,所述CPU包含多个核;
中央处理器(Central Processing Unit,简称CPU)作为计算机***的运算和控制核心,是信息处理、程序运行的最终执行单元。可选的,CPU可以包括对称多处理器(Symmetric Multiprocessing,简称SMP)或者异构多处理器(Heterogeneous Multiprocessing,简称HMP)的多核结构。其中,在SMP中所有的处理器都是对等的,它们通过总线连接共享同一块物理内存,这也就导致了***中所有资源(CPU、内存、输入输出接口(I/O)等)都是共享的。HMP可以在***内实现不同类型CPU、图形处理器(Graphics Processing Unit,简称GPU)和其他处理引擎的动态配置,在实际工作中确保最合适的任务分配、最优的性能、最低的功耗。借助多处理器技术,使得***级芯片(System-on-a-Chip,简称SoC)的复杂计算能力可以得到充分释放。
CPU的运行信息为用于表示CPU在运行时的状态的相关信息,如CPU的占用量(或占用率)、CPU的频率、CPU的温度等等。在 一个具体实施例中,所述CPU的运行信息包括所述CPU的温度和/或所述CPU的频率。
步骤S103,根据所述2D显示参数、所述3D显示参数和/或所述CPU的运行信息调整所述CPU的频率和/或更换执行核,所述执行核为运行目标线程的CPU核,所述目标线程为用于处理2D环境信息和/或3D环境信息的线程。
在同步显示2D环境信息和3D环境信息时,根据2D显示参数、3D显示参数监控2D算法和3D算法的执行情况,通过CPU的运行信息监控实现所述方法的硬件设备的运行情况,当检测到CPU的执行核无法支持2D和3D环境信息的同步实时显示时,可调整CPU的频率和/或更换执行核。
可选的,所述CPU核包括至少一个大核和若干个小核,大核的计算能力比小核强,现有的多核CPU可以包括2个大核和6个小核,或者4个大核和4个小核等。调整执行核也可以包括将目标线程从大核转到小核上运行,或者由小核转到大核上运行。
步骤S103设置了一种CPU的调度策略,以8个CPU核(2个大核(图2中的大核的核1和核2)和6个小核(图2中的小核的核1,…,核6))为例,该调度策略的示意图可参见图2通过CPU调频模块(如图2所示的CPU调频),调度策略本质上是对CPU的计算资源按需分配。将CPU调频操作集成在调度策略中,由于CPU的运行信息和2D/3D显示参数发生变化,需要实时更新该策略所管理CPU的频率。
可选的,CPU调频模块还可以用于将目标线程更换到其他CPU核上运行。由于调度策略管理了多个CPU核,可通过识别不同架构下CPU核之间的性能和功耗差异,来调度计算资源在不同的核心上运行。
可调度策略中,CPU调频模块可以决定动态电压频率调整 (Dynamic voltage and frequency scaling,简称DVFS),即动态的去调整电压和频率来平衡性能和功耗。具体地,根据芯片所运行的应用程序对CPU计算能力的不同需要,动态调节芯片的运行频率和电压,其中,对于同一芯片,频率越高,需要的电压也越高。
另外,同步显示2D环境信息和3D环境信息时,显示2D环境信息的区域尺寸小于显示3D环境信息的区域尺寸。
可选的,2D环境信息与3D环境信息可以在同一屏幕上显示,也可以在不同的屏幕上显示。可选的,若在同一屏幕上显示时,显示2D环境信息的区域尺寸占该屏幕总尺寸的三分之一,显示3D环境信息的区域尺寸占该屏幕总尺寸的三分之二。
图1的实施例提供一种同时显示2D和3D画面的方案,为避免嵌入式设备上直接运行2D/3D算法造成显示帧率降低的情况,可实时监控2D算法和3D算法的执行情况和CPU的运行信息,在无法支持2D和3D环境信息的同步实时显示时,调整执行核,以保证硬件支持,从而保证2D/3D显示的帧率和实时性,解决了现有的全景行车影像***同时显示2D和3D画面时的显示帧率降低、显示延迟的问题。
在一个实施例中,执行所述方法的硬件设备的CPU核包括至少一个大核和若干个小核,步骤S103中调整执行核的操作可以为:如果需要设定的CPU频率值调整到频率最大值、且所述CPU的温度在第一预设时间内增加,将所述执行核变更为大核。
其中,需要设定的CPU的频率值为能够满足2D/3D显示需求的频率值。频率最大值为设定的CPU的阈值,当CPU频率调整到一个阈值,将结合CPU的温度进行大小核切换,以计算能力更强的大核作为执行核,保证同步显示。
可选的,所述将所述执行核变更为大核之后,还包括:如果所述CPU的温度在第二预设时间内下降,将所述执行核变更为小核。
当CPU的温度在第二预设时间内下降,结合调度策略则将2D/3D算法又调度到小核心上运行。其中,第一预设时间可以与第二预设时间相同,也可以不同,该两段时间为监控CPU运行的时间,可根据需要设定。
在一个实施例中,调度策略中的CPU频率线性调整公式和大小核切换判断可参照下述公式(3)和公式(4)。
Figure PCTCN2021127903-appb-000003
所述P的取值满足公式:
Figure PCTCN2021127903-appb-000004
其中,freq new为需要设定的CPU频率,freq max为所述频率最大值,K为过度因子,load为CPU的当前负载,load max为CPU的最大负载,P为补偿因子,CPU temp为CPU的温度,fps 2D/3D为2D/3D的显示帧率。
在一个实施例中,请参见图1和图2,所述的方法还包括:控制无执行任务的CPU核进入浅睡眠模式。
可选的,在CPU的调度策略中通过CPU空闲(Idle)(如图2所示)模块来控制执行任务的CPU核进入睡眠模式。
其中,CPU核的睡眠模式可包括深度睡眠模式和浅(度)睡眠模式,虽然深度睡眠模式的CPU核心功耗最低,但是唤醒的延迟时间也明显较长,所以为了减少唤醒延迟,在调度策略中只允许CPU处于浅睡眠模式,禁止进入深度睡眠模式。
在一个实施例中,所述根据所述2D显示参数、所述3D显示参数和/或所述CPU的运行信息调整执行核之前,还包括:从应用层(图2中的Application层)获取2D显示参数和3D显示参数;该2D/3D 显示参数可以为显示2D/3D环境信息的显示帧率(在图2中简记作显示帧率)。从CPU内核(Kernel)中获取所述CPU的温度和所述CPU的频率。
在一个实施例中,图1中的目标线程至少包括图3中的3个线程:拍摄图像获取线程(如图3中线程1)、2D算法处理线程(如图3中线程2)和3D算法处理线程(如图3中线程3)。
其中,拍摄图像获取线程用于从拍摄车辆周围环境的一个或多个相机(Camera)获取相机拍摄的数据,并进行数据解析和数据封装(图3中已示出)。通常,多个相机包括采集车辆的前、后、左、右四个方向的相机。进一步,数据解析是为了将多个Camera的数据按照需求进行分解,如果Camera安装在车辆的前后左右四个方向,则需要把获得的一组数据解析为前、后、左、右四个独立数据组。进一步,Camera采集的数据为YUV格式,可以将前、后、左、右四路独立数据再次解析为Y分量和UV分量的数据组。数据封装是为了将分离后的数据分别以缓存(Buffer)的方法进行表达。
2D算法处理线程和3D算法处理线程同时读取Buffer中存储的数据,2D算法处理线程用于将获取的数据进行处理以得到可显示的2D环境信息,并将得到的2D环境信息进行数据传递以供显示设备显示。3D算法处理线程用于将获取的数据进行处理以得到可显示的3D环境信息,并将得到的3D环境信息进行数据传递以供显示设备显示。
本实施例中,将拍摄图像获取线程、2D算法处理线程和3D算法处理线程分为三个独立的线程,避免各个线程相互干扰,提高处理效率。
在一个实施例中,请参见图4,图3中线程1的执行步骤包括:
步骤S401,通过所述拍摄图像获取从车辆的若干个相机获取拍摄图像,并将获取的拍摄图像存储到缓存区,所述拍摄图像包括2D 图像和3D图像;
步骤S402,每次从所述缓存区截取至少一帧2D图像发送至所述2D算法处理线程,并由所述2D算法处理线程对所述2D图像进行处理,以得到可显示的2D环境信息;
步骤S403,每次从所述缓存区截取至少一帧3D图像发送至所述3D算法处理线程,并由所述3D算法处理线程对所述3D图像进行处理,以得到可显示的3D环境信息。
其中,步骤S402和步骤S403的执行顺序不限定,也可同时进行。
在一个实施例中,所述方法还包括:定义至少5个指针,所述指针包括第一指针、第二指针、第三指针、第四指针和第五指针,该5个指针的指向位置如图5所示;步骤S402中在每次从所述缓存区截取至少一帧2D图像时,第一指针和第二指针分别指向本次截取的开始位置(图5中的开始位置1)和停止位置(即图5中的停止位置1);步骤S403中在每次从所述缓存区截取至少一帧3D图像时,第三指针和第四指针分别指向本次截取的开始位置(图5中的开始位置2)和停止位置(即图5中的停止位置2);所述第五指针指向所述缓存区的最大容量位置(图5中已标示),所述第一指针、第二指针、第三指针和第四指针指向的位置均不能超出第五指针指向的位置。
读取2D图像时,只需要移动开始位置1的第一指针和停止位置1的第二指针,即以分片段的形式在Buffer中获得数据位置和长度,第一指针和第二指针的指向不能超过第五指针指向的位置。读取3D图像时,只需要移动开始位置2的第三指针和停止位置2的第四指针,即以分片段的形式在Buffer中获得数据位置和长度,第三指针和第四指针的指向不能超过第五指针指向的位置。
由此,能够同时运行的2D/3D算法处理线程就能够通过两组数据读取指针(一组为第一指针和第二指针,另一组为第三指针和第四指针)同时读取Buffer中的数据。
在一个实施例中,至少在下述其中一个操作中使用Zero-Copy技术:获取拍摄图像、截取2D图像、截取3D图像。
具体地,在步骤S402和步骤S403中,每次从所述缓存区截取至少一帧2D/3D图像发送至所述2D/3D算法处理线程时,可以将截取2D/3D图像的指针(第一指针和第二指针,或第三指针和第四指针)传递到2D/3D处理算法中,通过指针的方式传递指向的数据值可以减少大量数据实拷贝造成的延时,实现zero-copy技术。
可选的,在拍摄图像获取的线程1进行数据封装时,将解析的数据组以zero-copy技术中所述的Buffer方法进行表达,然后将封装的数据分别传递到2D与3D算法处理的线程2和线程3。
各个线程间的数据传输可使用zero-copy技术,多线程加速用于解耦拍摄图像获取和2D/3D处理算法的运行。目前已有的Camera数据(也即拍摄图像对应的数据)存储到Buffer方法中,只存在一组数据读取指针,如果存在多线程同时读取Buffer中的数据则无法进行并发处理。一种基于嵌入式设备的全景行车影像实时显示方法中,需要注意的是,若基于本发明的内容采用多数据读取指针实现的zero-copy技术,并不影响本发明的实质内容,均视为本发明的保护范围。
同时使用zero-copy技术,在Buffer数据中使用多指针指向方法,使得多线程访问中能够并发的获得Camera Buffer的数据,并在数据解析、封装和传递的过程中使用指针拷贝传递,减少数据实拷贝造成的延迟。并将获取Camera数据和处理2D/3D算法进行解耦,使用多线程加速和zero-copy技术来减少数据和算法处理延迟。
在一个实施例中,所述方法还包括:接收车载中控和/或档位控制器发出的控制信息;根据所述控制信号同步显示2D环境信息和3D环境信息。
控制信息通过车载中控和/或挡位控制器发出,使用车载中控的控制按钮或者是倒挡/前进挡位进行开启和停止2D/3D的显示,以及 切换车载显示2D环境信息和3D环境信息的UI界面。例如,在倒挡时开启2D和3D的同步显示,若车辆正常前进,则仅开启2D显示即可。
请参见图6,本发明实施例还提供一种行车影像显示装置60,包括:
参数获取模块601,用于在同步显示2D环境信息和3D环境信息时,获取2D显示参数和3D显示参数,其中,所述2D显示参数为显示2D环境信息时的参数,所述3D显示参数为显示3D环境信息时的参数;
CPU运行信息获取模块602,用于获取CPU的运行信息,所述CPU包含多个核;
执行核调整模块603,用于根据所述2D显示参数、所述3D显示参数和/或所述CPU的运行信息调整所述CPU的频率和/或变更执行核,所述执行核为运行目标线程的CPU核,所述目标线程为用于处理2D环境信息和/或3D环境信息的线程。
关于上述行车影像显示装置60的工作原理、工作方式的更多内容,可以参照上述图1至图5中所述方法的相关描述,这里不再赘述。
在具体实施中,上述的用行车影像显示装置60可以对应于嵌入式设备中具有显示功能的芯片,或者对应于具有数据处理功能的芯片,例如片上***(System-On-a-Chip,简称SOC)、基带芯片等;或者对应于嵌入式设备中包括具有显示功能芯片的芯片模组;或者对应于具有数据处理功能芯片的芯片模组,或者对应于嵌入式设备。
在具体实施中,关于上述实施例中描述的各个装置、产品包含的各个模块/单元,其可以是软件模块/单元,也可以是硬件模块/单元,或者也可以部分是软件模块/单元,部分是硬件模块/单元。
例如,对于应用于或集成于芯片的各个装置、产品,其包含的各个模块/单元可以都采用电路等硬件的方式实现,或者,至少部分模 块/单元可以采用软件程序的方式实现,该软件程序运行于芯片内部集成的处理器,剩余的(如果有)部分模块/单元可以采用电路等硬件方式实现;对于应用于或集成于芯片模组的各个装置、产品,其包含的各个模块/单元可以都采用电路等硬件的方式实现,不同的模块/单元可以位于芯片模组的同一组件(例如芯片、电路模块等)或者不同组件中,或者,至少部分模块/单元可以采用软件程序的方式实现,该软件程序运行于芯片模组内部集成的处理器,剩余的(如果有)部分模块/单元可以采用电路等硬件方式实现;对于应用于或集成于终端的各个装置、产品,其包含的各个模块/单元可以都采用电路等硬件的方式实现,不同的模块/单元可以位于终端内同一组件(例如,芯片、电路模块等)或者不同组件中,或者,至少部分模块/单元可以采用软件程序的方式实现,该软件程序运行于终端内部集成的处理器,剩余的(如果有)部分模块/单元可以采用电路等硬件方式实现。
本发明实施例还提供一种存储介质,其上存储有计算机程序,所述计算机程序被处理器运行时执行图1至图5所述方法的步骤。所述存储介质可以是计算机可读存储介质,例如可以包括非挥发性存储器(non-volatile)或者非瞬态(non-transitory)存储器,还可以包括光盘、机械硬盘、固态硬盘等。
本发明实施例还提供一种嵌入式设备。所述嵌入式设备的CPU为多核,所述嵌入式设备包括如图6所述的行车影像显示装置60,或者,所述嵌入式设备可以包括存储器和处理器,所述存储器上存储有可在所述处理器上运行的计算机程序,所述处理器运行所述计算机程序时执行图1至图5所述方法的步骤。
请参见图7,本发明实施例还提供一种行车影像显示平台70,所述平台包括若干个采集车辆的环境信息并获取拍摄图像的相机701、嵌入式设备703和用于显示2D环境信息和/或3D环境信息的车载显示终端704。其中,2D环境信息在车载显示终端704中的2D显示区域中显示,3D环境信息在车载显示终端704中的3D显示区域中显 示。可选的,2D显示区域的尺寸小于3D显示区域的尺寸。2D与3D的显示基于操作***创建的不同表层(Surface)。
2D与3D的显示基于不同的Surface,所以在不同的Surface中2D和3D的显示帧率会存在差异,通过本发明实施例提供的行车影像显示方法,在相机等图像采集设备输出帧率的间隔(常用的嵌入式Camera采集设备输出帧率为30fps,间隔时间约为33ms)处理2D与3D算法,使得2D/3D的显示帧率与Camera采集设备的输出帧率能够同步。
可选的,行车影像显示平台70通过车载电源702供电。且嵌入式设备也可以由车载电源702单独供电,车载电源702的稳定供电也决定了CPU运行的稳定性。上述图1至图5所示全景行车影像实时显示方法可以在嵌入式设备或者汽车的行车影像显示平台的CPU中运行,CPU包括SMP或者MPP的多核结构。
可选的,嵌入式设备或行车影像显示平台70的硬件可以包括基于X86、ARM、MIPIS和Powpc架构的CPU,执行上述方法的软件***可以包括基于Android、iOS、windows和Linux的操作***。需要注意的是,若基于本发明的内容采用其他硬件***和软件***实现的基于嵌入式设备的全景行车影像实时显示方法,并不影响本发明的实质内容,均视为本发明的保护范围。
具体地,在本发明实施例中,所述处理器可以为中央处理单元(central processing unit,简称CPU),该处理器还可以是其他通用处理器、数字信号处理器(digital signal processor,简称DSP)、专用集成电路(application specific integrated circuit,简称ASIC)、现成可编程门阵列(field programmable gate array,简称FPGA)或者其他可编程逻辑器件、分立门或者晶体管逻辑器件、分立硬件组件等。通用处理器可以是微处理器或者该处理器也可以是任何常规的处理器等。
还应理解,本申请实施例中的存储器可以是易失性存储器或非易失性存储器,或可包括易失性和非易失性存储器两者。其中,非易失 性存储器可以是只读存储器(read-only memory,简称ROM)、可编程只读存储器(programmable ROM,简称PROM)、可擦除可编程只读存储器(erasable PROM,简称EPROM)、电可擦除可编程只读存储器(electrically EPROM,简称EEPROM)或闪存。易失性存储器可以是随机存取存储器(random access memory,简称RAM),其用作外部高速缓存。通过示例性但不是限制性说明,许多形式的随机存取存储器(random access memory,简称RAM)可用,例如静态随机存取存储器(static RAM,简称SRAM)、动态随机存取存储器(DRAM)、同步动态随机存取存储器(synchronous DRAM,简称SDRAM)、双倍数据速率同步动态随机存取存储器(double data rate SDRAM,简称DDR SDRAM)、增强型同步动态随机存取存储器(enhanced SDRAM,简称ESDRAM)、同步连接动态随机存取存储器(synchlink DRAM,简称SLDRAM)和直接内存总线随机存取存储器(direct rambus RAM,简称DR RAM)。
综上,本发明实现了一种行车影像显示方法及装置及平台、存储介质、嵌入式设备,在相机等图像采集设备输出为30fps的情况下,在每帧之间的间隔时间33ms内处理完2D/3D算法,保证2D/3D显示的帧率和实时性。本发明实施例具体包括以下效果:
1、可以维持2D/3D显示帧率与Camera采集设备的输出帧率同步。
2、利用嵌入式设备多核的特点,在CPU调度策略中融合2D/3D的fps和CPU频率和温度,及时调整2D/3D算法在大核还是小核上运行。
3、使用zero-copy技术,在Buffer数据中使用多指针指向方法,使得多线程访问中能够并发的获得Camera Buffer数据,并在数据解析、封装和传递的过程中使用指针拷贝传递,减少数据实拷贝造成的延迟。
其中,如果将拍摄图像的获取和2D/3D算法进行串行耦合处理,会造成延迟帧率增加。所以该方法中将获取Camera数据和处理 2D/3D算法进行解耦,使用多线程加速和zero-copy技术来减少数据和算法处理延迟。
应理解,本文中术语“和/或”,仅仅是一种描述关联对象的关联关系,表示可以存在三种关系,例如,A和/或B,可以表示:单独存在A,同时存在A和B,单独存在B这三种情况。另外,本文中字符“/”,表示前后关联对象是一种“或”的关系。
本申请实施例中出现的“多个”是指两个或两个以上。
本申请实施例中出现的第一、第二等描述,仅作示意与区分描述对象之用,没有次序之分,也不表示本申请实施例中对设备个数的特别限定,不能构成对本申请实施例的任何限制。
本申请实施例中出现的“连接”是指直接连接或者间接连接等各种连接方式,以实现设备间的通信,本申请实施例对此不做任何限定。
虽然本发明披露如上,但本发明并非限定于此。任何本领域技术人员,在不脱离本发明的精神和范围内,均可作各种更动与修改,因此本发明的保护范围应当以权利要求所限定的范围为准。

Claims (18)

  1. 一种行车影像显示方法,其特征在于,所述方法包括:
    在同步显示2D环境信息和3D环境信息时,获取2D显示参数和3D显示参数,其中,所述2D显示参数为显示2D环境信息时的参数,所述3D显示参数为显示3D环境信息时的参数;
    获取CPU的运行信息,所述CPU包含多个核;
    根据所述2D显示参数、所述3D显示参数和/或所述CPU的运行信息调整所述CPU的频率和/或变更执行核,所述执行核为运行目标线程的CPU核,所述目标线程为用于处理2D环境信息和/或3D环境信息的线程。
  2. 根据权利要求1所述的方法,其特征在于,所述CPU的运行信息包括所述CPU的温度和/或所述CPU的频率。
  3. 根据权利要求2所述的方法,其特征在于,所述2D显示参数包括显示2D环境信息的显示帧率,所述3D显示参数包括显示3D环境信息的显示帧率。
  4. 根据权利要求3所述的方法,其特征在于,所述CPU核包括至少一个大核和若干个小核,调整执行核包括:
    如果需要设定的CPU频率值调整到频率最大值、且所述CPU的温度在第一预设时间内增加,将所述执行核变更为大核。
  5. 根据权利要求4所述的方法,其特征在于,所述将所述执行核变更为大核之后,还包括:
    如果所述CPU的温度在第二预设时间内下降,将所述执行核变更为小核。
  6. 根据权利要求4所述的方法,其特征在于,根据下述公式调整需要设定的CPU频率值:
    Figure PCTCN2021127903-appb-100001
    所述P的取值满足公式:
    Figure PCTCN2021127903-appb-100002
    其中,freq new为需要设定的CPU频率,freq max为所述频率最大值,K为过度因子,load为CPU的当前负载,load max为CPU的最大负载,P为补偿因子,CPU temp为CPU的温度,fps 2D/3D为2D/3D的显示帧率。
  7. 根据权利要求1至3任一所述的方法,其特征在于,所述方法还包括:
    控制无执行任务的CPU核进入浅睡眠模式。
  8. 根据权利要求1至3任一所述的方法,其特征在于,所述根据所述2D显示参数、所述3D显示参数和/或所述CPU的运行信息调整执行核之前,还包括:
    从应用层获取2D显示参数和3D显示参数;
    从CPU内核中获取所述CPU的温度和所述CPU的频率。
  9. 根据权利要求1所述的方法,其特征在于,所述目标线程至少包括:拍摄图像获取线程、2D算法处理线程和3D算法处理线程。
  10. 根据权利要求9所述的方法,其特征在于,所述方法还包括:
    通过所述拍摄图像获取从车辆的若干个相机获取拍摄图像,并将获取的拍摄图像存储到缓存区,所述拍摄图像包括2D图像和3D图像;
    每次从所述缓存区截取至少一帧2D图像发送至所述2D算法处理线程,并由所述2D算法处理线程对所述2D图像进行处理,以得 到可显示的2D环境信息;
    每次从所述缓存区截取至少一帧3D图像发送至所述3D算法处理线程,并由所述3D算法处理线程对所述3D图像进行处理,以得到可显示的3D环境信息。
  11. 根据权利要求10所述的方法,其特征在于,所述方法还包括:
    定义至少5个指针,所述指针包括第一指针、第二指针、第三指针、第四指针和第五指针;
    在每次从所述缓存区截取至少一帧2D图像时,第一指针和第二指针分别指向本次截取的开始位置和停止位置;
    在每次从所述缓存区截取至少一帧3D图像时,第三指针和第四指针分别指向本次截取的开始位置和停止位置;
    所述第五指针指向所述缓存区的最大容量位置,所述第一指针、第二指针、第三指针和第四指针指向的位置均不能超出第五指针指向的位置。
  12. 根据权利要求9至11任一所述的方法,其特征在于,至少在下述其中一个操作中使用Zero-Copy技术:获取拍摄图像、截取2D图像、截取3D图像。
  13. 根据权利要求1所述的方法,其特征在于,所述方法还包括:
    接收车载中控和/或档位控制器发出的控制信息;
    根据所述控制信号同步显示2D环境信息和3D环境信息。
  14. 根据权利要求1所述的方法,其特征在于,同步显示2D环境信息和3D环境信息时,显示2D环境信息的区域尺寸小于显示3D环境信息的区域尺寸。
  15. 一种行车影像显示装置,其特征在于,所述装置包括:
    参数获取模块,用于在同步显示2D环境信息和3D环境信息时, 获取2D显示参数和3D显示参数,其中,所述2D显示参数为显示2D环境信息时的参数,所述3D显示参数为显示3D环境信息时的参数;
    CPU运行信息获取模块,用于获取CPU的运行信息,所述CPU包含多个核;
    执行核调整模块,用于根据所述2D显示参数、所述3D显示参数和/或所述CPU的运行信息调整所述CPU的频率和/或变更执行核,所述执行核为运行目标线程的CPU核,所述目标线程为用于处理2D环境信息和/或3D环境信息的线程。
  16. 一种存储介质,其上存储有计算机程序,其特征在于,所述计算机程序被处理器运行时执行权利要求1至14任一项所述方法的步骤。
  17. 一种嵌入式设备,其特征在于,所述嵌入式设备的CPU为多核,所述嵌入式设备包括如权利要求15所述的行车影像显示装置,或者,所述嵌入式设备包括存储器和处理器,所述存储器上存储有可在所述处理器上运行的计算机程序,所述处理器运行所述计算机程序时执行权利要求1至14任一项所述方法的步骤。
  18. 一种行车影像显示平台,其特征在于,所述平台包括若干个采集车辆的环境信息并获取拍摄图像的相机、权利要求17所述的嵌入式设备和用于显示2D环境信息和/或3D环境信息的车载显示终端。
PCT/CN2021/127903 2020-11-30 2021-11-01 行车影像显示方法、装置及平台、存储介质、嵌入式设备 WO2022111225A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202011373475.7 2020-11-30
CN202011373475.7A CN112486684B (zh) 2020-11-30 2020-11-30 行车影像显示方法、装置及平台、存储介质、嵌入式设备

Publications (1)

Publication Number Publication Date
WO2022111225A1 true WO2022111225A1 (zh) 2022-06-02

Family

ID=74937388

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2021/127903 WO2022111225A1 (zh) 2020-11-30 2021-11-01 行车影像显示方法、装置及平台、存储介质、嵌入式设备

Country Status (2)

Country Link
CN (1) CN112486684B (zh)
WO (1) WO2022111225A1 (zh)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112486684B (zh) 2020-11-30 2022-08-12 展讯半导体(成都)有限公司 Driving image display method and apparatus, platform, storage medium, and embedded device

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7119831B2 (en) * 2002-01-17 2006-10-10 Sony Corporation Information providing apparatus, information providing method, storage medium, and computer program
CN105511824A (zh) * 2015-11-30 2016-04-20 深圳市灵动飞扬科技有限公司 Split-screen display method and system
CN106951320A (zh) * 2017-01-23 2017-07-14 斑马信息科技有限公司 System and method for dynamically adjusting the CPU frequency of a connected car's head unit
CN107844177A (zh) * 2017-10-18 2018-03-27 歌尔科技有限公司 Device parameter adjustment method and apparatus, and electronic device
CN110413417A (zh) * 2019-08-02 2019-11-05 广州小鹏汽车科技有限公司 Method, apparatus and system for optimizing the running of in-vehicle system processes
CN110696720A (zh) * 2019-10-31 2020-01-17 广东好帮手丰诺电子科技有限公司 3D panoramic reversing system with original-vehicle button control
CN112486684A (zh) * 2020-11-30 2021-03-12 展讯半导体(成都)有限公司 Driving image display method and apparatus, platform, storage medium, and embedded device

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20160019062A1 (en) * 2014-07-16 2016-01-21 Ahmad Yasin Instruction and logic for adaptive event-based sampling
US9794340B2 (en) * 2014-09-15 2017-10-17 Ge Aviation Systems Llc Mechanism and method for accessing data in a shared memory
CN106598596A (zh) * 2016-12-14 2017-04-26 天津光电通信技术有限公司 OpenCL image processing method based on the Android platform
DE102017109239A1 (de) * 2017-04-28 2018-10-31 Ilnumerics Gmbh Computer-implemented method, computer-readable medium, and heterogeneous computing system
JP2019057178A (ja) * 2017-09-21 2019-04-11 東芝メモリ株式会社 Memory system and control method
CN109947569B (zh) * 2019-03-15 2021-04-06 Oppo广东移动通信有限公司 Method, apparatus, terminal, and storage medium for binding cores
CN110083460A (zh) * 2019-03-25 2019-08-02 华东师范大学 Design method for a microkernel architecture using event bus technology
CN110532091B (zh) * 2019-08-19 2022-02-22 中国人民解放军国防科技大学 Edge-vector load-balancing method and device for graph computing based on a graphics processor

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115242695A (zh) * 2022-07-22 2022-10-25 高新兴物联科技有限公司 Environment state monitoring method for a server, device, and computer-readable storage medium
CN115242695B (zh) * 2022-07-22 2023-08-15 高新兴物联科技股份有限公司 Environment state monitoring method for a server, device, and computer-readable storage medium

Also Published As

Publication number Publication date
CN112486684A (zh) 2021-03-12
CN112486684B (zh) 2022-08-12

Similar Documents

Publication Publication Date Title
WO2022111225A1 (zh) Driving image display method and apparatus, platform, storage medium, and embedded device
US11480804B2 (en) Distributed foveated rendering based on user gaze
ES2959308T3 (es) Display bandwidth reduction with multiple resolutions
ES2922054T3 (es) Support for multiple refresh rates in different regions of the panel display
US20230319395A1 (en) Service processing method and device
US11178424B2 (en) Adaptive foveated encoder and global motion predictor
US10643358B2 (en) HDR enhancement with temporal multiplex
US9672586B2 (en) Image synthesis method with DSP and GPU
JP2021022354A (ja) Enhanced high dynamic range imaging and tone mapping
US11800232B2 (en) Object pre-encoding for 360-degree view for optimal quality and latency
US10846815B2 (en) Policies and architecture to dynamically offload VR processing to HMD based on external cues
EP4120077A1 (en) Method for scheduling hardware accelerator, and task scheduler
DE102019117218A1 (de) Reduced rendering of a video with six degrees of freedom
CN110730304B (zh) Smart camera that accelerates image acquisition and display
DE112017000864T5 (de) Ray compression for efficient processing of graphics data in computing devices
US11373273B2 (en) Method and device for combining real and virtual images
CN115712351B (zh) Hierarchical rendering and interaction method and system for multi-user remote mixed-reality shared scenes
DE112018007659T5 (de) Object detection and tracking for autonomous driving using shadows and reflections
DE102022134250A1 (de) Technologies for selective low-power frame updating on a display
US10877811B1 (en) Scheduler for vector processing operator allocation
CN114257707A (zh) Smart IP camera with color night mode
US10896525B2 (en) Graphics system and method for use of sparse textures
US20230115371A1 (en) Efficient vision perception
US11216307B1 (en) Scheduler for vector processing operator readiness
US10699368B1 (en) Memory allocation techniques for graphics shader

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21896726

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21896726

Country of ref document: EP

Kind code of ref document: A1

32PN Ep: public notification in the ep bulletin as address of the addressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 150124)