CN114945072A - Dual-camera frame synchronization processing method and device, user terminal and storage medium


Info

Publication number
CN114945072A
CN114945072A
Authority
CN
China
Prior art keywords
image data
visible light
data frame
frame
camera
Prior art date
Legal status
Pending
Application number
CN202210415523.7A
Other languages
Chinese (zh)
Inventor
蔡伟明
胡明
陈闰
Current Assignee
Uni Trend Technology China Co Ltd
Original Assignee
Uni Trend Technology China Co Ltd
Priority date
Filing date
Publication date
Application filed by Uni Trend Technology China Co Ltd
Priority to CN202210415523.7A
Publication of CN114945072A

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00: Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60: Control of cameras or camera modules
    • H04N23/665: Control of cameras or camera modules involving internal camera communication with the image sensor, e.g. synchronising or multiplexing SSIS control signals
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00: Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/45: Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from two or more image sensors being of different type or operating in different modes, e.g. with a CMOS sensor for moving images in combination with a charge-coupled device [CCD] for still images
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00: Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/80: Camera processing pipelines; Components thereof

Abstract

A dual-camera frame synchronization processing method, apparatus, user terminal and storage medium are provided for a device comprising an infrared camera and a visible light camera. The method comprises the following steps: parameter acquisition, infrared image data acquisition, visible light image data acquisition, timestamp retrieval, and pairing judgment. The method, apparatus, user terminal and storage medium perform frame synchronization of infrared images and visible light images more simply and efficiently in a soft synchronization manner, and can handle binocular cameras with different or identical frame rates, thereby improving the consistency and timeliness of infrared and visible light image processing.

Description

Dual-camera frame synchronization processing method and device, user terminal and storage medium
Technical Field
The present application relates to the field of image data processing technologies, and in particular to a dual-camera frame synchronization processing method and apparatus, a user terminal, and a storage medium.
Background
Most existing camera devices carry two cameras on the same side: an infrared camera and a visible light camera. When a thermal imaging panoramic picture is generated, frame data synchronization must be performed between the infrared image acquired by the infrared camera and the visible light image acquired by the visible light camera before subsequent splicing processing.
In the process of implementing the technical solution in the embodiment of the present application, the inventors of the present application find that the above-mentioned technology has at least the following technical problems:
camera devices on the market achieve frame synchronization between the infrared image and the visible light image through hard synchronization; specifically, the same hardware simultaneously issues a trigger acquisition command so that all sensors capture and measure at the same time. For example, the infrared camera and the visible light camera are exposed simultaneously under the same hardware pulse signal. This approach can only handle an infrared camera and a visible light camera with the same frame rate, which is a significant limitation.
It is to be noted that the information disclosed in the above background section is only for enhancement of understanding of the background of the present disclosure, and thus may include information that does not constitute prior art known to those of ordinary skill in the art.
Disclosure of Invention
In view of at least one of the above technical problems, the present application provides a dual-camera frame synchronization processing method and apparatus, a user terminal, and a storage medium, to address the limitation that most existing commercial camera devices achieve frame synchronization of infrared and visible light images through hard synchronization, a processing manner that can only handle infrared and visible light cameras with the same frame rate.
In order to achieve the above object, an embodiment according to a first aspect of the present application provides a dual-camera frame synchronization processing method, applied to an infrared camera and a visible light camera, comprising:
a parameter acquisition step: acquiring a first camera shooting parameter of an infrared camera and a second camera shooting parameter of a visible light camera, and generating a first gap trigger time and a second gap trigger time according to the first camera shooting parameter and the second camera shooting parameter, wherein the first camera shooting parameter comprises a first frame rate, a first resolution and a first image processing bandwidth, and the second camera shooting parameter comprises a second frame rate, a second resolution and a second image processing bandwidth;
an infrared image data acquisition step: acquiring an infrared image data frame and storing the infrared image data frame in a first data buffer pool, wherein the infrared image data frame has a first time stamp;
a visible light image data acquisition step: acquiring a visible light image data frame and storing the visible light image data frame in a second data buffer pool, wherein the visible light image data frame has a second time stamp;
a time stamp retrieval step: taking out infrared image data frames from a first data buffer pool based on first gap trigger time, and taking out visible light image data frames from a second data buffer pool based on second gap trigger time;
a pairing judgment step: judging whether the first time stamp of the taken-out infrared image data frame is the same as the second time stamp of the taken-out visible light image data frame, and if so, optimizing the taken-out infrared image data frame and visible light image data frame.
In one implementation, the pairing judgment step further includes:
when the first time stamp of the taken infrared image data frame is the same as the second time stamp of the taken visible light image data frame, generating an identification mark on the infrared image data frame and the visible light image data frame;
identifying whether the infrared image data frame and the visible light image data frame have identification marks or not;
if yes, the timestamp retrieval step is stopped.
In one implementation, after an infrared image data frame and a visible light image data frame with identification marks are optimized, the identification marks of the infrared image data frame and the visible light image data frame are eliminated;
the timestamp retrieval step is repeated.
In one implementation, if the first data buffer pool is not empty, a timestamp retrieval step is started;
and if the second data buffer pool is not empty, starting the timestamp retrieval step.
In one implementation, the timestamp retrieval step further comprises a timeout range interval;
generating waiting time after taking out the infrared image data frame from the first data buffer pool based on the first gap triggering time;
or, after the visible light image data frame is taken out from the second data buffer pool based on the second gap trigger time, generating a waiting time;
and if the waiting time exceeds the timeout range interval, the timestamp retrieval step is performed again.
In one implementation, if the first frame rate is the same as the second frame rate, the timeout range interval is 0 to 50 ms.
In one implementation, if the first frame rate is different from the second frame rate, the timeout range interval is 0 to 100 ms.
To achieve the above object, an embodiment according to a second aspect of the present application provides a dual-camera frame synchronization processing apparatus, comprising:
The parameter acquisition module is used for acquiring a first camera shooting parameter of the infrared camera and a second camera shooting parameter of the visible light camera, and generating a first gap trigger time and a second gap trigger time according to the first camera shooting parameter and the second camera shooting parameter, wherein the first camera shooting parameter comprises a first frame rate, a first resolution and a first image processing bandwidth, and the second camera shooting parameter comprises a second frame rate, a second resolution and a second image processing bandwidth;
the first acquisition module is used for acquiring an infrared image data frame and storing the infrared image data frame in a first data buffer pool, wherein the infrared image data frame is provided with a first time stamp;
the second acquisition module is used for acquiring a visible light image data frame and storing the visible light image data frame in a second data buffer pool, wherein the visible light image data frame has a second time stamp;
the time stamp retrieval module is used for taking out the infrared image data frames from the first data buffer pool based on the first gap triggering time and taking out the visible light image data frames from the second data buffer pool based on the second gap triggering time;
and the pairing judgment module is used for judging whether the first time stamp of the taken infrared image data frame is the same as the second time stamp of the taken visible light image data frame.
According to the dual-camera frame synchronization processing method or apparatus described above, frame synchronization of infrared images and visible light images is handled more simply and efficiently in a soft synchronization manner, and binocular cameras with different or identical frame rates can both be handled, further improving the consistency and timeliness of infrared and visible light image processing.
To achieve the above object, according to a third aspect of the present application, there is provided a user terminal, including a memory and a processor;
a memory for storing a computer program;
a processor for implementing the dual-camera frame synchronization processing method according to the first aspect of the present application when executing a computer program.
To achieve the above object, according to an embodiment of a fourth aspect of the present application, there is provided a storage medium having a computer program stored thereon, which, when executed by a processor, implements the dual-camera frame synchronization processing method according to the first aspect of the present application.
The invention is further described with reference to the following figures and examples.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings in the following description show only some embodiments of the present invention; for those skilled in the art, other drawings can be obtained from these drawings without creative effort.
Fig. 1 is a schematic diagram of a first principle of a dual-camera frame synchronization processing method according to an embodiment of the present application;
fig. 2 is a schematic diagram of a second principle of a dual-camera frame synchronization processing method according to an embodiment of the present application.
Fig. 3 is a schematic diagram illustrating a pairing determination step according to an embodiment of the present application.
Fig. 4 is a schematic block diagram of a dual-camera frame synchronization processing apparatus according to an embodiment of the present application.
Fig. 5 is a block diagram of an exemplary user terminal used to implement embodiments of the present application.
Detailed Description
In order to make the aforementioned objects, features and advantages of the present application more comprehensible, embodiments of the present application are described in detail below with reference to the accompanying drawings. In the following description, numerous specific details are set forth in order to provide a thorough understanding of the present application. The present application can, however, be embodied in many forms other than those described herein, and those skilled in the art may make similar modifications without departing from its spirit and scope; it is therefore not limited to the specific embodiments disclosed below.
The terms used in the embodiments of the present application will be described below.
Frame rate: a measure of the number of frames displayed per unit time.
Resolution: the amount of information stored in an image, i.e., the number of pixels per inch of the image.
Timestamp: data generated using digital signature technology; the signed object includes the original file information, signature parameters, signing time, and other information.
Existing camera devices achieve frame synchronization between the infrared image and the visible light image through hard synchronization: the same hardware simultaneously issues trigger acquisition commands so that all sensors capture and measure at the same time; for example, the infrared camera and the visible light camera are exposed simultaneously under the same hardware pulse signal. This approach can only handle an infrared camera and a visible light camera with the same frame rate. Moreover, because the resolutions of the two cameras differ, data transmission is delayed to some degree and the sizes of the sampled images differ, so the timestamps of the images sampled by the two cameras already differ somewhat. If an infrared camera and a visible light camera with different frame rates were processed by hard synchronization, the inconsistent frame rates and resolutions would make data transmission inconsistent, and the timestamps of the images sampled by the two cameras would differ greatly.
An implementation environment of the present application comprises a target scene and a terminal device having an infrared camera and a visible light camera. The terminal device may be a mobile phone, a computer, a tablet computer, an infrared thermal imager, or the like. The dual-camera frame synchronization processing method is mainly applied to an infrared thermal imager: after the infrared camera and the visible light camera of the thermal imager acquire images, a hook function retrieves the infrared image and the visible light image with consistent timestamps, and the two images are then processed.
Fig. 1 is a schematic diagram of a first principle of a dual-camera frame synchronization processing method according to an embodiment of the present application. As shown in fig. 1, an embodiment of the first aspect of the present application provides a dual-camera frame synchronization processing method, applied to an infrared camera and a visible light camera, comprising:
step 100 is a parameter acquisition step: the method comprises the steps of obtaining a first camera shooting parameter of an infrared camera and a second camera shooting parameter of a visible light camera, and generating a first gap trigger time and a second gap trigger time according to the first camera shooting parameter and the second camera shooting parameter, wherein the first camera shooting parameter comprises a first frame rate, a first resolution and a first image processing bandwidth, and the second camera shooting parameter comprises a second frame rate, a second resolution and a second image processing bandwidth.
In step 100, the first camera shooting parameters are the specific parameters of the infrared camera; likewise, the second camera shooting parameters are the specific parameters of the visible light camera. The parameters of the infrared camera and the visible light camera are acquired, and the first gap trigger time and the second gap trigger time are calculated and generated according to the hook function.
The first gap trigger time is the gap between the hook function taking out two adjacent infrared image data frames from the first data buffer pool. The second gap trigger time is the gap between the hook function taking out two adjacent visible light image data frames from the second data buffer pool. In this way, the retrieval rate of the hook function in the data buffer pools can be increased.
The hook function is calculated as:
[Formula shown only as an image in the original publication (Figure BDA0003605750230000071); its inputs are the processing bandwidth, the set frame rate, the resolution, and the trigger frequency described below.]
Whether the frame rates of the infrared camera and the visible light camera are the same or different, the inputs to the formula are chosen per camera. For the visible light camera, the processing bandwidth is the second image processing bandwidth, the frame rate is the second frame rate, and the resolution is the second resolution; for the infrared camera, the processing bandwidth is the first image processing bandwidth, the frame rate is the first frame rate, and the resolution is the first resolution. In every case, the trigger frequency is the sum of the reciprocal of the first frame rate and the reciprocal of the second frame rate.
Thus, the gap trigger time of the hook function is determined by the first camera shooting parameter and the second camera shooting parameter.
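Since the full formula survives only as an image, the Python sketch below implements just the relation the text does state: the trigger quantity is the sum of the reciprocals of the two frame rates. The function name is ours, and the bandwidth and resolution terms that the figure would supply are deliberately omitted; treat this as an assumption-laden illustration, not the patent's formula.

def gap_trigger_interval(first_frame_rate: float, second_frame_rate: float) -> float:
    """Interval between successive hook-function fetches, per the stated relation.

    The description calls 1/f1 + 1/f2 a "trigger frequency", but the sum of two
    frame periods is dimensionally a time, so it is treated here as the fetch
    interval in seconds. The image processing bandwidth and resolution also
    enter the patent's full formula (shown only as a figure) and are omitted.
    """
    return 1.0 / first_frame_rate + 1.0 / second_frame_rate

# Example: a 25 fps infrared camera paired with a 30 fps visible light camera
# yields an interval of 1/25 + 1/30 = 0.0733... seconds between fetches.
print(gap_trigger_interval(25.0, 30.0))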
Step 200 is an infrared image data acquisition step: an infrared image data frame is acquired and stored in a first data buffer pool, the infrared image data frame having a first timestamp.
In step 200, the infrared image data frame is captured by the infrared camera. After the infrared image data frame is obtained, it is stored in the first data buffer pool, which is dedicated to infrared image data frames. A first timestamp is generated for each infrared image data frame.
Step 300 is a visible light image data acquisition step: a frame of visible light image data is acquired and stored in a second data buffer pool, the frame of visible light image data having a second time stamp.
In step 300, the visible light image data frame is captured by the visible light camera. After the visible light image data frame is obtained, it is stored in the second data buffer pool, which is dedicated to visible light image data frames. A second timestamp is generated for each visible light image data frame.
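As a rough illustration of steps 200 and 300, the sketch below models each data buffer pool as a thread-safe queue and stamps every frame on arrival. The patent does not specify the buffer implementation, the clock, or any of these names; queue.Queue, time.monotonic() and TimestampedFrame are stand-ins we have assumed.

import queue
import time
from dataclasses import dataclass

@dataclass
class TimestampedFrame:
    timestamp: float      # first timestamp (infrared) or second timestamp (visible)
    data: bytes           # raw frame payload from the camera
    paired: bool = False  # identification mark, used in the pairing steps below

# One buffer pool per camera; queue.Queue stands in for the patent's
# unspecified "data buffer pool".
first_data_buffer_pool: "queue.Queue[TimestampedFrame]" = queue.Queue()
second_data_buffer_pool: "queue.Queue[TimestampedFrame]" = queue.Queue()

def on_frame_captured(pool: queue.Queue, raw: bytes) -> None:
    """Camera callback: stamp the frame on arrival and store it in its pool."""
    pool.put(TimestampedFrame(timestamp=time.monotonic(), data=raw))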
Step 400 is a timestamp retrieval step: and taking out the infrared image data frames from the first data buffer pool based on the first gap trigger time, and taking out the visible light image data frames from the second data buffer pool based on the second gap trigger time.
In step 400, once an infrared image data frame is stored in the first data buffer pool, the hook function starts fetching from that pool: it takes out one infrared image data frame, waits for the first gap trigger time, takes out the next, and repeats until no infrared image data frame remains in the pool. The handling of visible light image data frames stored in the second data buffer pool follows the same principle and is not repeated here. By setting the first gap trigger time and the second gap trigger time, data can be taken out of the first and second data buffer pools independently and rapidly, further improving the acquisition rate of infrared image data frames and visible light image data frames.
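The loop just described can be sketched per buffer pool as follows. This is our reading of the description (fetch one frame, wait the gap trigger time, repeat until the pool is empty); the function name is ours, not the patent's.

import queue
import time
from typing import Callable

def timestamp_retrieval(pool: queue.Queue, gap_trigger_time: float,
                        deliver: Callable) -> None:
    """Hook-function sketch: take frames out of one buffer pool, one per gap
    trigger interval, until the pool has no frames left."""
    while True:
        try:
            frame = pool.get_nowait()     # take out one frame
        except queue.Empty:
            break                         # no frames remain: stop retrieval
        deliver(frame)                    # hand the frame to pairing judgment
        time.sleep(gap_trigger_time)      # wait out the gap trigger time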
Step 500 is a pairing judgment step: judging whether the first time stamp of the taken-out infrared image data frame is the same as the second time stamp of the taken-out visible light image data frame, and if so, optimizing the taken-out infrared image data frame and visible light image data frame.
In step 500, it is determined whether the first timestamp of the taken-out infrared image data frame is the same as the second timestamp of the taken-out visible light image data frame. If they are the same, optimization processing starts on the two frames. If they are different, the taken-out infrared image data frame and visible light image data frame are put back into the first data buffer pool and the second data buffer pool, respectively, so that the hook function can recapture them later.
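A sketch of this judgment under the same assumptions: matching frames go on to optimization, mismatched frames are returned to their pools. The patent compares the timestamps for strict equality; the tolerance parameter is our addition for illustration and defaults to exact matching.

import queue

def pairing_judgment(ir_frame, vis_frame,
                     first_pool: queue.Queue, second_pool: queue.Queue,
                     tolerance: float = 0.0) -> bool:
    """Return True when the first and second timestamps match (the caller then
    optimizes the pair); otherwise put both frames back for re-capture."""
    if abs(ir_frame.timestamp - vis_frame.timestamp) <= tolerance:
        return True                  # matched: proceed to optimization
    first_pool.put(ir_frame)         # put back into the first data buffer pool
    second_pool.put(vis_frame)       # put back into the second data buffer pool
    return False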
Fig. 2 is a schematic diagram of a second principle of a dual-camera frame synchronization processing method according to an embodiment of the present application. Fig. 3 is a schematic diagram illustrating a pairing judgment step according to an embodiment of the present application. As shown in fig. 2 and fig. 3, in another embodiment, the pairing judgment step further includes:
when the first time stamp of the taken infrared image data frame is the same as the second time stamp of the taken visible light image data frame, generating an identification mark on the infrared image data frame and the visible light image data frame;
step 510 is an identification step: identifying whether the infrared image data frame and the visible light image data frame have identification marks or not;
if yes, the timestamp retrieval step is stopped.
In this embodiment, the identification mark is used to indicate whether the infrared image data frame and the visible light image data frame have completed timestamp pairing, and the timestamp retrieval step is stopped according to the identification mark; that is, the hook function stops taking infrared image data frames and visible light image data frames out of the first and second data buffer pools. Specifically, when the first timestamp of the taken-out infrared image data frame differs from the second timestamp of the taken-out visible light image data frame, no identification mark is generated on either frame.
In other embodiments, after the infrared image data frame and the visible light image data frame with the identification marks are optimized, the identification marks of the infrared image data frame and the visible light image data frame are eliminated;
the timestamp retrieval step is repeated.
In this embodiment, after the infrared image data frame and the visible light image data frame with the identification marks are optimized, the identification marks of the infrared image data frame and the visible light image data frame are eliminated. And identifying the identification mark again, and if the identification mark is not identified, performing the time stamp retrieval step again.
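Reusing the TimestampedFrame sketch above, the mark lifecycle condenses to a few lines; the paired flag is a hypothetical stand-in for the patent's identification mark.

def process_matched_pair(ir_frame, vis_frame, optimize) -> None:
    """Sketch of the identification-mark flow: mark the pair on a timestamp
    match, treat present marks as a signal to pause retrieval, optimize, then
    clear the marks so the timestamp retrieval step can be performed again."""
    if ir_frame.timestamp == vis_frame.timestamp:
        ir_frame.paired = vis_frame.paired = True    # generate the marks
    if ir_frame.paired and vis_frame.paired:         # marks found: pause retrieval
        optimize(ir_frame, vis_frame)                # optimization processing
        ir_frame.paired = vis_frame.paired = False   # eliminate the marks
        # the timestamp retrieval step resumes after this point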
In other embodiments, if the first data buffer pool is not empty, the timestamp retrieval step is initiated;
and if the second data buffer pool is not empty, starting the timestamp retrieval step.
In this embodiment, the hook function captures data in the first data buffer pool and the second data buffer pool separately. Therefore, as soon as the first data buffer pool and/or the second data buffer pool is no longer empty, the hook function can respond quickly and capture data, improving the capture efficiency of infrared image data frames and visible light image data frames.
In further embodiments, the timestamp retrieval step further comprises a timeout range interval;
generating waiting time after taking out the infrared image data frame from the first data buffer pool based on the first gap triggering time;
or, after the visible light image data frame is taken out from the second data buffer pool based on the second gap trigger time, generating a waiting time;
and if the waiting time exceeds the timeout range interval, the timestamp retrieval step is performed again.
In this embodiment, setting the timeout range interval helps bound the waiting time after an infrared image data frame or a visible light image data frame is taken out, thereby avoiding the problem of occupying processing resources.
In other embodiments, if the first frame rate is the same as the second frame rate, the timeout range interval is 0 to 50 ms.
In other embodiments, if the first frame rate is different from the second frame rate, the timeout range interval is 0 to 100 ms.
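A sketch of the timeout handling, with the 50 ms and 100 ms upper bounds taken from the two frame-rate cases above; queue.Queue.get(timeout=...) is an assumed stand-in for the described waiting time, and the names are ours.

import queue

# Upper bounds of the timeout range interval, per the two cases above.
TIMEOUT_SAME_FRAME_RATE = 0.050       # seconds, first frame rate == second
TIMEOUT_DIFFERENT_FRAME_RATE = 0.100  # seconds, first frame rate != second

def fetch_with_timeout(pool: queue.Queue, timeout_s: float):
    """Wait at most timeout_s for the partner frame; on expiry return None so
    the caller performs the timestamp retrieval step again."""
    try:
        return pool.get(timeout=timeout_s)    # waiting time accrues here
    except queue.Empty:
        return None                           # exceeded the timeout range interval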
As shown in fig. 4, a dual-camera frame synchronization processing apparatus 600 according to an embodiment of the second aspect of the present application includes:
the parameter acquiring module 610 is configured to acquire a first camera shooting parameter of the infrared camera and a second camera shooting parameter of the visible light camera, and generate a first gap trigger time and a second gap trigger time according to the first camera shooting parameter and the second camera shooting parameter, where the first camera shooting parameter includes a first frame rate, a first resolution and a first image processing bandwidth, and the second camera shooting parameter includes a second frame rate, a second resolution and a second image processing bandwidth;
a first obtaining module 620, configured to obtain an infrared image data frame, and store the infrared image data frame in a first data buffer pool, where the infrared image data frame has a first timestamp;
a second obtaining module 630, configured to obtain a visible light image data frame and store the visible light image data frame in a second data buffer pool, where the visible light image data frame has a second timestamp;
a timestamp retrieval module 640, configured to take out an infrared image data frame from the first data buffer pool based on the first gap trigger time, and take out a visible light image data frame from the second data buffer pool based on the second gap trigger time;
and the pairing judgment module 650 is configured to judge whether the first timestamp of the extracted infrared image data frame is the same as the second timestamp of the extracted visible light image data frame.
According to the dual-camera frame synchronization processing method or apparatus described above, frame synchronization of infrared images and visible light images is handled more simply and efficiently in a soft synchronization manner, and binocular cameras with different or identical frame rates can both be handled, further improving the consistency and timeliness of infrared and visible light image processing.
Fig. 4 is a schematic block diagram of a dual-camera frame synchronization processing apparatus according to an embodiment of the present application, and more specific implementation manners of the modules of the dual-camera frame synchronization processing apparatus 600 may refer to descriptions of the dual-camera frame synchronization processing method according to the present invention, and have similar beneficial effects, and are not described herein again.
According to an embodiment of a third aspect of the present application, there is provided a user terminal comprising a memory and a processor;
a memory for storing a computer program;
a processor for implementing the dual-camera frame synchronization processing method according to the first aspect of the present application when executing a computer program.
The user terminal of the third aspect of the present application may be implemented by referring to the content specifically described in the embodiment of the first aspect of the present application, and has similar beneficial effects to the method for processing frame synchronization of two cameras according to the embodiment of the first aspect of the present application, and details are not repeated here.
Fig. 5 is a block diagram of an exemplary user terminal for implementing the embodiments of the present application, and the user terminal shown in fig. 5 is only an example and should not bring any limitation to the functions and the scope of use of the embodiments of the present disclosure.
As shown in fig. 5, the user terminal 10 may be implemented as a general purpose computing device. The components of the user terminal 10 may include, but are not limited to: one or more processors or processing units 11, a system memory 12, and a bus 13 that couples various system components including the system memory 12 and the processing units 11.
Bus 13 represents one or more of any of several types of bus structures, including a memory bus or memory controller, a peripheral bus, an accelerated graphics port, a processor, or a local bus using any of a variety of bus architectures. These architectures include, but are not limited to, the Industry Standard Architecture (ISA) bus, the Micro Channel Architecture (MCA) bus, the Enhanced ISA bus, the Video Electronics Standards Association (VESA) local bus, and the Peripheral Component Interconnect (PCI) bus.
User terminal 10 typically includes a variety of computer system readable media. These media may be any available media that may be accessed by user terminal 10 and includes both volatile and nonvolatile media, removable and non-removable media.
Memory 12 may include computer system readable media in the form of volatile memory, such as random access memory (RAM) 14 and/or cache memory 15. The user terminal 10 may further include other removable/non-removable, volatile/non-volatile computer-readable storage media. By way of example only, storage system 16 may be used to read from and write to non-removable, non-volatile magnetic media (not shown, commonly referred to as a "hard drive"). Although not shown in fig. 5, a disk drive for reading from and writing to a removable, non-volatile magnetic disk (e.g., a "floppy disk") and an optical disk drive for reading from or writing to a removable, non-volatile optical disk (e.g., a compact disc read-only memory (CD-ROM), a digital versatile disc read-only memory (DVD-ROM), or other optical media) may be provided. In these cases, each drive may be connected to bus 13 by one or more data media interfaces. The memory may include at least one program product having a set (e.g., at least one) of program modules configured to carry out the functions of embodiments of the disclosure.
A program/utility 18 having a set (at least one) of program modules 17 may be stored, for example, in memory, such program modules 17 including but not limited to an operating system, one or more application programs, other program modules 17, and program data, each of which examples or some combination thereof may comprise an implementation of a network environment. Program modules 17 generally perform the functions and/or methods of the embodiments described in this disclosure.
The user terminal 10 may also communicate with one or more external devices 19 (e.g., a keyboard, a pointing device, a display 20, etc.), with one or more devices that enable a user to interact with the computer system/server, and/or with any device (e.g., a network card, a modem, etc.) that enables the computer system/server to communicate with one or more other user terminals 10. Such communication may occur through an input/output (I/O) interface 21. Moreover, the user terminal 10 may also communicate with one or more networks (e.g., a local area network (LAN), a wide area network (WAN), and/or a public network such as the Internet) via the network adapter 22. As shown, the network adapter 22 communicates with the other modules of the user terminal 10 over the bus 13. It should be noted that, although not shown in the figures, other hardware and/or software modules may be used in conjunction with the user terminal 10, including but not limited to: microcode, device drivers, redundant processing units, external disk drive arrays, RAID systems, tape drives, data backup storage systems, and the like.
The processing unit 11 executes various functional applications and data processing by executing programs stored in the system memory 12, for example, implementing the methods mentioned in the foregoing embodiments.
The user terminal 10 of the embodiment of the present application may be a server or a computationally-limited terminal device.
According to an embodiment of the fourth aspect of the present application, there is provided a storage medium having stored thereon a computer program which, when executed by a processor, implements the dual-camera frame synchronization processing method according to the first aspect of the present application.
Generally, computer instructions for carrying out the methods of the present invention may be carried using any combination of one or more computer-readable storage media. A non-transitory computer-readable storage medium may include any computer-readable medium except a transitorily propagating signal itself.
A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples (a non-exhaustive list) of the computer readable storage medium would include the following: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the context of this document, a computer readable storage medium may be any tangible medium that can contain or store a program for use by or in connection with an instruction execution system, apparatus, or device.
Computer program code for carrying out operations for aspects of the present invention may be written in any combination of one or more programming languages, including object-oriented programming languages such as Java, Smalltalk, and C++, and conventional procedural programming languages such as the "C" programming language or similar programming languages; in particular, the Python language suitable for neural network computing and platform frameworks based on TensorFlow or PyTorch may be employed. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server. In the case of a remote computer, the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet service provider).
The foregoing is merely a preferred embodiment of the present application and is not intended to limit the present application in any way. Those skilled in the art can now make numerous possible variations and modifications to the disclosed embodiments, or modify equivalent embodiments, using the methods and techniques disclosed above, without departing from the scope of the claimed embodiments. Therefore, all equivalent changes made according to the shape, structure and principle of the present application without departing from the content of the technical scheme of the present application should be covered in the protection scope of the present application.

Claims (10)

1. A dual-camera frame synchronization processing method, applied to an infrared camera and a visible light camera, characterized by comprising the following steps:
a parameter acquisition step: acquiring a first camera shooting parameter of the infrared camera and a second camera shooting parameter of the visible light camera, and generating a first gap trigger time and a second gap trigger time according to the first camera shooting parameter and the second camera shooting parameter, wherein the first camera shooting parameter comprises a first frame rate, a first resolution and a first image processing bandwidth, and the second camera shooting parameter comprises a second frame rate, a second resolution and a second image processing bandwidth;
an infrared image data acquisition step: acquiring an infrared image data frame, and storing the infrared image data frame in a first data buffer pool, wherein the infrared image data frame has a first time stamp;
a visible light image data acquisition step: acquiring a visible light image data frame and storing the visible light image data frame in a second data buffer pool, wherein the visible light image data frame has a second time stamp;
a timestamp retrieval step: taking out infrared image data frames from the first data buffer pool based on the first gap trigger time, and taking out visible light image data frames from the second data buffer pool based on the second gap trigger time;
a pairing judgment step: judging whether the first time stamp of the taken-out infrared image data frame is the same as the second time stamp of the taken-out visible light image data frame, and if so, optimizing the taken-out infrared image data frame and the taken-out visible light image data frame.
2. The dual-camera frame synchronization processing method according to claim 1, wherein the pairing judgment step further comprises:
when the first time stamp of the extracted infrared image data frame is the same as the second time stamp of the extracted visible light image data frame, generating identification marks on the infrared image data frame and the visible light image data frame;
identifying whether the infrared image data frame and the visible light image data frame have identification marks or not;
and if so, stopping the timestamp retrieval step.
3. The dual-camera frame synchronization processing method according to claim 2, wherein after the infrared image data frame and the visible light image data frame with the identification marks are optimized, the identification marks of the infrared image data frame and the visible light image data frame are eliminated;
and re-performing the timestamp retrieval step.
4. The method of claim 1, wherein if the first data buffer pool is not empty, the timestamp retrieval step is started;
and if the second data buffer pool is not empty, starting a timestamp retrieval step.
5. The dual-camera frame synchronization processing method according to claim 1, wherein the timestamp retrieval step further comprises a timeout range interval;
generating a waiting time after taking out the infrared image data frame from the first data buffer pool based on the first gap triggering time;
or, after the visible light image data frame is taken out from the second data buffer pool based on the second gap trigger time, generating a waiting time;
and if the waiting time exceeds the timeout range interval, the timestamp retrieval step is performed again.
6. The dual-camera frame synchronization processing method according to claim 5, wherein if the first frame rate is the same as the second frame rate, the timeout range interval is 0 to 50 ms.
7. The dual-camera frame synchronization processing method according to claim 5, wherein if the first frame rate is different from the second frame rate, the timeout range interval is 0 to 100 ms.
8. A dual-camera frame synchronization processing apparatus, characterized by comprising:
The parameter acquisition module is used for acquiring a first camera shooting parameter of the infrared camera and a second camera shooting parameter of the visible light camera, and generating a first gap trigger time and a second gap trigger time according to the first camera shooting parameter and the second camera shooting parameter, wherein the first camera shooting parameter comprises a first frame rate, a first resolution and a first image processing bandwidth, and the second camera shooting parameter comprises a second frame rate, a second resolution and a second image processing bandwidth;
the device comprises a first acquisition module, a second acquisition module and a third acquisition module, wherein the first acquisition module is used for acquiring an infrared image data frame and storing the infrared image data frame in a first data buffer pool, and the infrared image data frame is provided with a first timestamp;
the second acquisition module is used for acquiring a visible light image data frame and storing the visible light image data frame in a second data buffer pool, wherein the visible light image data frame is provided with a second time stamp;
the time stamp retrieval module is used for taking out the infrared image data frames from the first data buffer pool based on the first gap triggering time and taking out the visible light image data frames from the second data buffer pool based on the second gap triggering time;
and the pairing judgment module is used for judging whether the first timestamp of the taken infrared image data frame is the same as the second timestamp of the taken visible light image data frame.
9. A user terminal comprising a memory and a processor;
the memory is used for storing a computer program;
the processor, when executing the computer program, is configured to implement the method of processing frame synchronization for dual cameras according to any one of claims 1 to 7.
10. A storage medium having stored thereon a computer program which, when executed by a processor, implements the dual-camera frame synchronization processing method according to any one of claims 1 to 7.
CN202210415523.7A 2022-04-20 2022-04-20 Dual-camera frame synchronization processing method and device, user terminal and storage medium Pending CN114945072A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210415523.7A CN114945072A (en) 2022-04-20 2022-04-20 Dual-camera frame synchronization processing method and device, user terminal and storage medium

Publications (1)

Publication Number Publication Date
CN114945072A (en) 2022-08-26

Family

ID=82906723

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210415523.7A Pending CN114945072A (en) 2022-04-20 2022-04-20 Dual-camera frame synchronization processing method and device, user terminal and storage medium

Country Status (1)

Country Link
CN (1) CN114945072A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115883815A (en) * 2022-10-28 2023-03-31 珠海视熙科技有限公司 Image data output method and device, lower computer and storage medium



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination