CN115883815A - Image data output method and device, lower computer and storage medium - Google Patents

Image data output method and device, lower computer and storage medium

Info

Publication number
CN115883815A
Authority
CN
China
Prior art keywords
lower computer
map
target
depth map
camera
Prior art date
Legal status
Pending
Application number
CN202211335128.4A
Other languages
Chinese (zh)
Inventor
姚紫微
罗德祥
刘敏
Current Assignee
Zhuhai Shixi Technology Co Ltd
Original Assignee
Zhuhai Shixi Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Zhuhai Shixi Technology Co Ltd
Priority to CN202211335128.4A
Publication of CN115883815A
Current legal status: Pending

Landscapes

  • Image Processing (AREA)

Abstract

The application discloses an image data output method and apparatus, a lower computer, and a storage medium, which enable an upper computer to obtain a frame-synchronized depth map and color map and improve the frame synchronization of an RGBD camera. The method comprises the following steps: the lower computer obtains the current working mode of the upper computer; if the working mode is a dual-camera working mode, the lower computer controls a TOF camera and an RGB camera to turn on, and obtains a depth map output by the TOF camera and a color map output by the RGB camera; the lower computer compares the timestamp of the depth map with the timestamp of the color map to determine a synchronized target depth map and target color map; and the lower computer caches the target depth map and the target color map in a UVC data transmission buffer, and outputs the data in the UVC data transmission buffer to the upper computer through the UVC protocol.

Description

Image data output method and device, lower computer and storage medium
Technical Field
The present application relates to the field of image processing, and in particular, to a method and an apparatus for outputting image data, a lower computer, and a storage medium.
Background
An RGBD camera forms an RGBD image when capturing images. An RGBD image is composed of a color (RGB) image and a depth (Depth) image, so the images output by an RGBD camera contain both three-dimensional information and color information.
In the prior art, most RGBD cameras on the market are built from an RGB module and a depth module that are not mapped to each other. Not only must the resolutions of the two modules be specified separately through the upper-computer client when in use, but when both devices output images, the frame synchronization of the RGB image and the depth map is complicated to implement and must be performed on the upper computer. The difficulties of performing frame synchronization on the upper computer are: 1. the UVC (USB Video Class) protocol does not carry timestamp information during image transmission, that is, the upper computer cannot use timestamps to perform frame synchronization; 2. the upper computer must open two UVC camera devices separately, and because the characteristics of different devices differ, the upper computer cannot accurately start the two UVC camera devices at the same time, which also leads to poor frame synchronization.
Disclosure of Invention
The application provides an image data output method and apparatus, a lower computer, and a storage medium, which enable the upper computer to obtain a frame-synchronized depth map and color map and improve the frame synchronization of an RGBD camera.
A first aspect of the present application provides a method of image data output, comprising:
the lower computer obtains the current working mode of the upper computer;
if the working mode is a dual-camera working mode, the lower computer controls a TOF camera and an RGB camera to turn on, and obtains a depth map output by the TOF camera and a color map output by the RGB camera;
the lower computer compares the timestamp of the depth map with the timestamp of the color map to determine a synchronized target depth map and target color map;
and the lower computer caches the target depth map and the target color map in a UVC data transmission buffer, and outputs the data in the UVC data transmission buffer to the upper computer through the UVC protocol.
Optionally, the lower computer comparing the timestamp of the depth map with the timestamp of the color map to determine a synchronized target depth map and target color map includes:
the lower computer stores the depth map and the corresponding timestamp into a first buffer queue, and stores the color map and the corresponding timestamp into a second buffer queue;
the lower computer determines the color map at the head of the second buffer queue as the target color map;
and the lower computer matches a synchronized target depth map in the first buffer queue according to the timestamp of the target color map.
Optionally, the lower computer matching the synchronized target depth map in the first buffer queue according to the timestamp of the target color map includes:
the lower computer traverses the first buffer queue from left to right, and calculates the timestamp difference between the timestamp of the target color map and the timestamp of each depth map in the first buffer queue;
and determines the depth map whose timestamp difference is smaller than a preset value as the target depth map.
Optionally, after the lower computer matches the synchronized target depth map in the first buffer queue according to the timestamp of the target color map, the method further includes:
the lower computer releases the data located before the target depth map in the first buffer queue.
Optionally, the lower computer acquiring the depth map output by the TOF camera includes:
the lower computer acquires a phase map output by the TOF camera;
and the lower computer converts the phase map into a depth map and an infrared map.
Optionally, after the lower computer compares the timestamp of the depth map with the timestamp of the color map to determine the synchronized target depth map and target color map, the method further includes:
the lower computer determines a target infrared map according to the target depth map, the target infrared map and the target depth map having the same timestamp;
and the lower computer caching the target depth map and the target color map in the UVC data transmission buffer includes:
the lower computer caches the target depth map, the target infrared map, and the target color map in the UVC data transmission buffer.
Optionally, after the lower computer obtains the current working mode of the upper computer, the method further includes:
the lower computer determines the current resolution of the upper computer;
and the lower computer determines the data content and the format of the depth map and the color map according to the resolution.
Optionally, the depth map is in a YUV format, and the color map is in a YUV format or an MJPEG format.
Optionally, after the lower computer acquires the current working mode of the upper computer, the method includes:
if the working mode is a single-camera working mode, the lower computer determines the current resolution of the upper computer;
the lower computer controls the TOF camera to turn on or controls the RGB camera to turn on according to the resolution;
if the lower computer controls the TOF camera to turn on, it acquires a phase map output by the TOF camera and outputs the phase map to the upper computer;
and if the lower computer controls the RGB camera to turn on, it acquires a color map output by the RGB camera and outputs the color map to the upper computer.
Optionally, the phase map is in a YUV format, and the color map is in an MJPEG format.
A second aspect of the application provides a lower computer, comprising:
the acquisition unit is used for acquiring the current working mode of the upper computer;
the first control unit is used for controlling the TOF camera and the RGB camera to turn on when the working mode is a dual-camera working mode, and acquiring a depth map output by the TOF camera and a color map output by the RGB camera;
the synchronization unit is used for comparing the timestamp of the depth map with the timestamp of the color map to determine a synchronized target depth map and target color map;
the first output unit is used for caching the target depth map and the target color map to a UVC data transmission buffer area and outputting data in the UVC data transmission buffer area to the upper computer through a UVC protocol.
Optionally, the synchronization unit includes:
the storing module is used for storing the depth map and the corresponding timestamp into a first buffer queue, and storing the color map and the corresponding timestamp into a second buffer queue;
the determining module is used for determining the color map at the head of the second buffer queue as the target color map;
and the matching module is used for matching a synchronized target depth map in the first buffer queue according to the timestamp of the target color map.
Optionally, the matching module is specifically configured to:
traverse the first buffer queue from left to right, and calculate the timestamp difference between the timestamp of the target color map and the timestamp of each depth map in the first buffer queue;
and determine the depth map whose timestamp difference is smaller than a preset value as the target depth map.
Optionally, the synchronization unit further includes:
and the releasing module is used for releasing the data located before the target depth map in the first buffer queue.
Optionally, the first control unit is specifically configured to:
acquire a phase map output by the TOF camera;
and convert the phase map into a depth map and an infrared map.
Optionally, the synchronization unit is further configured to:
determine a target infrared map according to the target depth map, the target infrared map and the target depth map having the same timestamp;
the first output unit is specifically configured to:
cache the target depth map, the target infrared map, and the target color map in the UVC data transmission buffer.
Optionally, the first control unit is further configured to:
determine the current resolution of the upper computer;
and determine the data content and format of the depth map and the color map according to the resolution.
Optionally, the depth map is in a YUV format, and the color map is in a YUV format or an MJPEG format.
Optionally, the lower computer further includes:
the second control unit is used for determining the current resolution of the upper computer when the working mode is a single-camera working mode, and controlling the TOF camera or the RGB camera to turn on according to the resolution;
the second output unit is used for acquiring the phase map output by the TOF camera and outputting the phase map to the upper computer when the second control unit controls the TOF camera to turn on;
and the third output unit is used for acquiring the color map output by the RGB camera and outputting the color map to the upper computer when the second control unit controls the RGB camera to turn on.
A third aspect of the present application provides an apparatus for image data output, the apparatus comprising:
the device comprises a processor, a memory, an input and output unit and a bus;
the processor is connected with the memory, the input and output unit and the bus;
the memory stores a program that the processor calls to perform the method of image data output according to the first aspect or any optional implementation of the first aspect.
A fourth aspect of the present application provides a computer-readable storage medium having a program stored thereon which, when executed on a computer, performs the method of image data output according to the first aspect or any optional implementation of the first aspect.
According to the technical scheme, the method has the following advantages:
the TOF camera and the RGB camera are connected to a lower computer, the working mode of the lower computer is designated by an upper computer when the working mode is a double-camera working mode, the lower computer controls the TOF camera and the RGB camera to be started to respectively obtain a depth map output by the TOF camera and a color map output by the RGB camera, the lower computer carries out frame synchronization on the obtained depth map and color map through timestamp information, and the depth map and color map after frame synchronization are transmitted to the upper computer through a UVC protocol.
Drawings
In order to illustrate the technical solutions in the present application more clearly, the drawings required for describing the embodiments are briefly introduced below. Obviously, the drawings described below show only some embodiments of the present application, and those skilled in the art may derive other drawings from them without creative effort.
FIG. 1 is a schematic flowchart of an embodiment of the method for outputting image data provided by the present application;
FIG. 2 is a schematic flowchart of another embodiment of the method for outputting image data provided by the present application;
FIG. 3 is a schematic flowchart of yet another embodiment of the method for outputting image data provided by the present application;
FIG. 4 is a schematic structural diagram of an embodiment of the lower computer provided by the present application;
FIG. 5 is a schematic structural diagram of an embodiment of the apparatus for outputting image data provided by the present application.
Detailed Description
The application provides an image data output method, an image data output apparatus, a lower computer, and a storage medium, which enable the upper computer to obtain a frame-synchronized depth map and color map and improve the frame synchronization of an RGBD camera.
Referring to FIG. 1, FIG. 1 shows an embodiment of the method for outputting image data provided by the present application. The method includes:
101. The lower computer obtains the current working mode of the upper computer;
it should be noted that the method for outputting image data provided by the present application is applied to a lower computer, for example, a network camera, a conference device, and other UVC camera devices with a camera. The lower computer is communicated with the upper computer through a UVC protocol, and the UVC protocol belongs to a USB equipment class standard protocol and is a unified data exchange standard of video equipment for a USB interface. When the device is used, the upper computer can designate a working mode of the lower computer, the lower computer body provides two working modes, namely a single-camera working mode and a double-camera working mode, wherein the single-camera working mode is that the lower computer only starts a TOF camera to transmit a depth map to the upper computer or only starts an RGB camera to transmit a color map to the upper computer; and in the working mode of the double cameras, the TOF camera and the RGB camera are simultaneously started, and a depth map and a color map are simultaneously transmitted to the upper computer.
It should be noted that the working mode in the present application may be specified directly by the resolution; that is, the lower computer can determine the current working mode of the upper computer from the resolution specified by the upper computer. In some specific embodiments, taking the ASR7205 platform (a main control chip) as an example, a 1280x962 TOF camera and a 1280x960 RGB camera are connected to the platform. A lower-computer software program A (equivalent to the lower computer in the present application) based on the UVC protocol (image transmission, camera control, other extension protocols, etc.) is developed on the platform. The lower-computer software program A constructs a USB camera for use by a host-side upper computer B (equivalent to the upper computer in the present application) through the UVC protocol, and configures four USB camera resolutions, namely:
1. YUV1280x720; (dual-camera working mode)
2. YUV1200x640; (dual-camera working mode)
3. YUV1280x962; (single-camera working mode)
4. MJPEG1280x960; (single-camera working mode)
An ordinary camera outputs image data in only two formats, YUV (raw data) and MJPEG (Motion JPEG). Because the data volume of the YUV format is large, the output frame rate is very low at high resolution; the MJPEG format, having undergone quantization and encoding, has a small data volume and a high output frame rate, but poorer image quality than the YUV format. In actual use, the host-side upper computer B selects a specific resolution and opens the lower-computer software program A through the UVC protocol; program A determines the current working mode from the resolution and executes the subsequent steps according to that working mode, as sketched below.
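As a minimal sketch of this resolution-to-mode dispatch, consider the following C fragment. The enum and function names and the table layout are illustrative assumptions rather than the patent's actual code; only the four format/resolution pairs and the working modes they imply come from the example above.

```c
#include <stddef.h>
#include <stdint.h>

typedef enum { MODE_DUAL_CAMERA, MODE_SINGLE_CAMERA } work_mode_t;
typedef enum { FMT_YUV, FMT_MJPEG } frame_fmt_t;

typedef struct {
    frame_fmt_t fmt;    /* container format negotiated over UVC   */
    uint16_t    width;  /* negotiated frame width                  */
    uint16_t    height; /* negotiated frame height                 */
    work_mode_t mode;   /* working mode implied by the resolution  */
} mode_entry_t;

/* The four resolutions configured by lower-computer program A. */
static const mode_entry_t mode_table[] = {
    { FMT_YUV,   1280, 720, MODE_DUAL_CAMERA   }, /* depth + YUV color   */
    { FMT_YUV,   1200, 640, MODE_DUAL_CAMERA   }, /* depth + MJPEG color */
    { FMT_YUV,   1280, 962, MODE_SINGLE_CAMERA }, /* TOF phase map only  */
    { FMT_MJPEG, 1280, 960, MODE_SINGLE_CAMERA }, /* RGB color map only  */
};

/* Determine the working mode from the resolution selected by the host. */
static int lookup_mode(frame_fmt_t fmt, uint16_t w, uint16_t h,
                       work_mode_t *mode_out)
{
    for (size_t i = 0; i < sizeof mode_table / sizeof mode_table[0]; i++) {
        if (mode_table[i].fmt == fmt &&
            mode_table[i].width == w && mode_table[i].height == h) {
            *mode_out = mode_table[i].mode;
            return 0;
        }
    }
    return -1; /* resolution not configured */
}
```

Because the mode is keyed entirely on the negotiated format and resolution, the upper computer never needs a separate mode-switch command; selecting a stream is the mode switch.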
102. If the working mode is a dual-camera working mode, the lower computer controls the TOF camera and the RGB camera to turn on, and obtains a depth map output by the TOF camera and a color map output by the RGB camera;
If the upper computer specifies that the working mode of the lower computer is the dual-camera working mode, the lower computer controls the connected TOF camera and RGB camera to turn on and receives the depth map output by the TOF camera and the color map output by the RGB camera, respectively.
It should be noted that the depth map output by the TOF camera is always in the YUV format, and the color map output by the RGB camera is in the YUV format; the lower computer may also encode the YUV color map to obtain a color map in the MJPEG format. Specifically, the format of the color map finally output by the lower computer may be determined by the resolution selected by the upper computer.
For example, in the example given in step 101, YUV1280x720 and YUV1200x640 correspond to the dual-camera working mode in the present application. Specifically, if the upper computer specifies a resolution of 1280x720, the data format and content output by the lower computer are a 640x480 YUV depth map and a 640x480 YUV color map; if the upper computer specifies a resolution of 1200x640, the data format and content output by the lower computer are a 640x480 YUV depth map and a 640x480 MJPEG color map.
103. The lower computer compares the timestamp of the depth map with the timestamp of the color map to determine a synchronized target depth map and target color map;
In the lower computer, the depth map output by the TOF camera and the color map output by the RGB camera both carry a timestamp, and the lower computer can synchronize the depth map and the color map using these timestamps to obtain a synchronized target depth map and target color map.
104. The lower computer caches the target depth map and the target color map in the UVC data transmission buffer, and outputs the data in the UVC data transmission buffer to the upper computer through the UVC protocol.
The lower computer fills the synchronized target depth map and target color map into uvc_buffer, i.e., the UVC data transmission buffer in the present application. When uvc_buffer is full, the lower computer outputs the data in uvc_buffer to the upper computer through the UVC protocol, at which point the upper computer obtains a frame-synchronized depth map and color map.
In this embodiment, the TOF camera and the RGB camera are connected to the lower computer, and the upper computer specifies the working mode of the lower computer in use. When the working mode is the dual-camera working mode, the lower computer controls the TOF camera and the RGB camera to turn on and obtains the depth map output by the TOF camera and the color map output by the RGB camera, respectively; the lower computer then frame-synchronizes the obtained depth map and color map using timestamp information and transmits the frame-synchronized depth map and color map to the upper computer through the UVC protocol. The upper computer can thus directly obtain a frame-synchronized depth map and color map without any change to the transmission protocol, which greatly saves the computing resources of the upper computer, simplifies the frame synchronization process of the RGBD camera, and improves the frame synchronization of the RGBD camera.
Furthermore, multiple USB camera resolutions are configured in advance on the lower computer, so the upper computer can directly select the required resolution, and the lower computer determines the data format and content output for the TOF camera and the RGB camera according to that resolution. The resolutions and image formats of the RGB map and the depth map can thus be flexibly combined: the upper computer does not need to specify the resolutions of the two modules separately but only needs to specify a single resolution, and the lower computer completes the control of the TOF camera and the RGB camera and the frame synchronization of the depth map and the color map. The operation flow is simple and convenient, further simplifying the use and frame synchronization of the RGBD camera.
Referring to FIG. 2, FIG. 2 shows another embodiment of the method for outputting image data provided by the present application, describing how the depth map and the color map are synchronized. The method includes:
201. The lower computer obtains the current working mode of the upper computer;
In this embodiment, step 201 is similar to step 101 of the previous embodiment and is not described again here.
202. If the working mode is a dual-camera working mode, the lower computer controls the TOF camera and the RGB camera to turn on, and obtains a phase map output by the TOF camera and a color map output by the RGB camera;
In this embodiment, if the upper computer specifies that the working mode of the lower computer is the dual-camera working mode, the lower computer controls the TOF camera and the RGB camera to turn on, and receives the phase map output by the TOF camera and the color map output by the RGB camera, respectively.
203. The lower computer converts the phase map into a depth map and an infrared map;
and after the lower computer receives the phase diagram data output by the TOF camera, the phase diagram data is input into a DSP module in the lower computer, namely the phase diagram is converted into a depth diagram and an infrared diagram in a processor through an algorithm. In this embodiment, the TOF camera uses an ietf (index-TOF) technique, where the ietf refers to emitting modulated infrared light by an infrared light emitter, irradiating a scene with the modulated light, measuring a phase delay of the return light reflected by an object in the scene, and measuring a distance indirectly by using an orthogonal sampling technique after obtaining the phase delay, that is, obtaining depth information of the scene. The iTOF can be further divided into a Continuous Wave (CW) modulation and demodulation mode and a Pulse Modulated (PM) modulation and demodulation mode according to different modulation and demodulation types, the Continuous Wave modulation generally modulates a transmitted light Wave rate spectrum into a square Wave with variable intensity, a demodulation end detects waveform phase change after reflection of a target object, the measurement method firstly binds light flight distance information and phase information of the light intensity change, and then converts the phase information into light intensity information detectable by a photoelectric detector, and measurement of light flight time is indirectly realized. The depth data output in the iTOF technology is obtained through phase-unwrapping calculation, so that the lower computer can convert the phase image output by the TOF camera into a depth image and an infrared image by processing the phase image. That is, in this embodiment, the lower computer can not only output the depth map to the upper computer, but also output the corresponding infrared map to the upper computer.
204. The lower computer stores the depth map and the corresponding timestamp into a first buffer queue, and stores the color map and the corresponding timestamp into a second buffer queue;
For subsequent frame synchronization, the lower computer stores the depth map obtained from the phase-map conversion, together with the corresponding timestamp information, into a depth-map buffer queue DepthQueue in the lower computer, i.e., the first buffer queue in the present application.
The lower computer stores the color map output by the RGB camera, together with the corresponding timestamp information, into a color-map buffer queue RGBQueue, i.e., the second buffer queue in the present application.
205. The lower computer determines the color map at the head of the second buffer queue as the target color map;
The lower computer monitors RGBQueue in real time; when color map data exists in RGBQueue, the entry RGBQueue(0) at the head of the queue is determined as the target color map, and the timestamp of the target color map is read.
206. The lower computer matches a synchronized target depth map in the first buffer queue according to the timestamp of the target color map;
The lower computer searches DepthQueue according to the timestamp of the target color map to find a synchronized target depth map.
Specifically, the lower computer traverses DepthQueue from left to right, compares the timestamp of the target color map with the timestamp in each entry of DepthQueue, calculates the timestamp difference (an absolute value) between the target color map and each depth map in DepthQueue, finds the depth map DepthQueue(n) whose timestamp difference is smaller than a preset value, and determines it as the target depth map; that is, RGBQueue(0) and DepthQueue(n) are considered synchronized frames.
It should be noted that the preset value may be set according to the specific device. For example, with the preset value set to 40 ms, the lower computer finds, according to the timestamp of the target color map RGBQueue(0), the target depth map DepthQueue(n) whose timestamp difference is within 40 ms.
207. The lower computer releases the data located before the target depth map in the first buffer queue;
After determining the synchronized target color map RGBQueue(0) and target depth map DepthQueue(n), the lower computer releases the data of RGBQueue(0) and moves the data of RGBQueue(1) to RGBQueue(0). For DepthQueue, however, the lower computer releases the data of DepthQueue(0) through DepthQueue(n-1), i.e., all data located before the target depth map in the first buffer queue, and moves the data of DepthQueue(n) to DepthQueue(0); the next frame synchronization is then performed, and so on in a loop.
It should be noted that the reason for releasing DepthQueue(0) through DepthQueue(n-1) without releasing DepthQueue(n) is as follows: the frame rate of the TOF camera is generally lower than that of the RGB camera, so the timestamp differences between two successive color maps and the same depth map may both be smaller than the preset value. Releasing in this way guarantees the output uvc_buffer frame rate to the greatest extent, unaffected by frame synchronization. A minimal sketch of this matching-and-release logic is given below.
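The following C sketch covers steps 204 through 207. The queue type, its fixed capacity, and the function names are illustrative assumptions; what comes from the description above is the head-of-RGBQueue target, the left-to-right traversal, the 40 ms example threshold, and releasing DepthQueue(0) through DepthQueue(n-1) while keeping DepthQueue(n).

```c
#include <stdint.h>

#define SYNC_THRESHOLD_MS 40 /* preset value from the example above */
#define QUEUE_CAP         16 /* illustrative bound for the FIFOs    */

typedef struct {
    uint64_t ts_ms; /* capture timestamp in milliseconds */
    void    *frame; /* frame payload (depth or color)    */
} stamped_frame_t;

typedef struct {
    stamped_frame_t items[QUEUE_CAP]; /* head at index 0 */
    int             count;
} frame_queue_t;

/* Remove the first n entries, shifting the rest toward index 0. */
static void queue_pop_front(frame_queue_t *q, int n)
{
    for (int i = n; i < q->count; i++)
        q->items[i - n] = q->items[i];
    q->count -= n;
}

/* Match the head of rgb_q against depth_q. On success, *depth and *color
 * receive a synchronized pair, older depth frames are released, and 0 is
 * returned; -1 means no synchronized pair is available yet. */
static int match_sync_pair(frame_queue_t *depth_q, frame_queue_t *rgb_q,
                           stamped_frame_t *depth, stamped_frame_t *color)
{
    if (rgb_q->count == 0 || depth_q->count == 0)
        return -1;

    uint64_t target = rgb_q->items[0].ts_ms; /* timestamp of RGBQueue(0) */
    for (int n = 0; n < depth_q->count; n++) {
        uint64_t ts = depth_q->items[n].ts_ms;
        uint64_t diff = ts > target ? ts - target : target - ts;
        if (diff < SYNC_THRESHOLD_MS) {
            *color = rgb_q->items[0];
            *depth = depth_q->items[n];
            queue_pop_front(rgb_q, 1);   /* release RGBQueue(0)            */
            queue_pop_front(depth_q, n); /* release DepthQueue(0..n-1) but */
                                         /* keep DepthQueue(n) at the head */
            return 0;
        }
    }
    return -1;
}
```

Keeping DepthQueue(n) in the queue lets the same depth frame pair with the next color frame as well, which is what preserves the output frame rate when the TOF camera runs slower than the RGB camera.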
It should be noted that the frame synchronization method described in steps 204 to 206 of this embodiment is applicable not only to scenarios using the iTOF technique but also to scenarios using dTOF (direct-TOF), in which the time of flight of the light is measured directly; using a different TOF technique does not affect the result of frame synchronization.
208. The lower computer determines a target infrared map according to the target depth map, the target infrared map and the target depth map having the same timestamp;
In this embodiment, after determining the target depth map synchronized with the target color map, the lower computer determines the target infrared map with the same timestamp directly from the timestamp of the target depth map. In some specific embodiments, in step 204 the lower computer may directly store the depth map and the infrared map together in the first buffer queue DepthQueue and synchronize them together.
209. The lower computer caches the target depth map, the target infrared map, and the target color map in the UVC data transmission buffer, and outputs the data in the UVC data transmission buffer to the upper computer through the UVC protocol.
The lower computer fills the synchronized target depth map, target infrared map, and target color map into uvc_buffer, i.e., the UVC data transmission buffer in the present application. When uvc_buffer is full, the lower computer outputs the data in uvc_buffer to the upper computer through the UVC protocol, at which point the upper computer obtains a frame-synchronized depth map, infrared map, and color map.
It should be noted that the depth map output by the TOF camera is always in the YUV format, while the color map output by the RGB camera may be in the YUV format; the lower computer may further encode the YUV color map to obtain a color map in the MJPEG format (MJPEG encoding must be performed before synchronization). The format of the color map finally output by the lower computer may be determined by the resolution selected by the upper computer. For example:
Taking a resolution of 1280x720 specified by the upper computer as an example: the corresponding uvc_buffer is YUV1280x720. The lower computer stores the 640x480 YUV target depth map from DepthQueue(n) into the first 640x480 position of uvc_buffer, stores the 640x480 YUV target infrared map into the second 640x480 position of uvc_buffer, and stores the 640x480 YUV target color map from RGBQueue(0) into the third position of uvc_buffer. The upper computer thus obtains a 640x480 YUV depth map, a 640x480 YUV infrared map, and a 640x480 YUV color map.
Taking a resolution of 1200x640 specified by the upper computer as an example: the corresponding uvc_buffer is YUV1200x640. The lower computer stores the 640x480 YUV target depth map from DepthQueue(n) into the first 640x480 position of uvc_buffer, stores the 640x480 YUV target infrared map into the second 640x480 position of uvc_buffer, and stores the 640x480 MJPEG target color map from RGBQueue(0) into the third position of uvc_buffer. The upper computer thus obtains a 640x480 YUV depth map, a 640x480 YUV infrared map, and a 640x480 MJPEG color map. The packing of uvc_buffer is sketched below.
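The following sketch shows the uvc_buffer packing for the YUV1280x720 case, assuming the 2-bytes-per-pixel YUY2 layout common over UVC. Under that assumption the arithmetic works out exactly: 1280x720 = 3 x 640x480 = 921,600 pixels, so the three 640x480 tiles fill the buffer back to back. The offsets and names are illustrative assumptions, not the patent's code.

```c
#include <stdint.h>
#include <string.h>

#define TILE_W     640
#define TILE_H     480
#define TILE_BYTES (TILE_W * TILE_H * 2) /* YUY2: 2 bytes per pixel */

/* Pack one synchronized triple into the UVC transfer buffer.
 * uvc_buf must hold at least 3 * TILE_BYTES bytes (= 1280x720 YUY2). */
static void pack_uvc_buffer(uint8_t *uvc_buf,
                            const uint8_t *depth_yuv, /* DepthQueue(n)  */
                            const uint8_t *ir_yuv,    /* same timestamp */
                            const uint8_t *color_yuv) /* RGBQueue(0)    */
{
    memcpy(uvc_buf,                  depth_yuv, TILE_BYTES); /* 1st slot */
    memcpy(uvc_buf + TILE_BYTES,     ir_yuv,    TILE_BYTES); /* 2nd slot */
    memcpy(uvc_buf + 2 * TILE_BYTES, color_yuv, TILE_BYTES); /* 3rd slot */
    /* once full, the buffer is handed to the UVC stack and reaches the
     * host as a single 1280x720 "frame" carrying all three images */
}
```

In the YUV1200x640 case the first two slots are the same, but the remaining space (307,200 bytes under the same 2-bytes-per-pixel assumption) is smaller than a raw 640x480 tile (614,400 bytes), which is consistent with the compressed MJPEG color map being placed in the third slot there.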
In this embodiment, the TOF camera and the RGB camera are connected to the lower computer, and the upper computer specifies the working mode of the lower computer in use. When the working mode is the dual-camera working mode, the lower computer controls the TOF camera and the RGB camera to turn on and obtains the depth map output by the TOF camera and the color map output by the RGB camera, respectively; the lower computer then frame-synchronizes the obtained depth map and color map using timestamp information and transmits the frame-synchronized depth map and color map to the upper computer through the UVC protocol. The upper computer can thus directly obtain a frame-synchronized depth map and color map without any change to the transmission protocol, which greatly saves the computing resources of the upper computer, simplifies the frame synchronization process of the RGBD camera, and improves the frame synchronization of the RGBD camera.
In this embodiment, multiple USB camera resolutions are configured in advance on the lower computer, so the upper computer can directly select the required resolution and the lower computer determines the data format and content output for the TOF camera and the RGB camera according to that resolution, flexibly combining the resolutions and formats of the RGB map and the depth map without the upper computer having to specify the resolutions of the two modules separately; the operation flow is simple. Moreover, during frame synchronization, because the frame rate of the TOF camera is lower than that of the RGB camera, all data located before the target depth map in the first buffer queue is released after each frame synchronization before the next one is performed, which guarantees the output uvc_buffer frame rate to the greatest extent, unaffected by frame synchronization.
Referring to FIG. 3, FIG. 3 shows another embodiment of the method for outputting image data provided by the present application. The method includes:
301. The lower computer obtains the current working mode of the upper computer;
In this embodiment, step 301 is similar to step 101 of the previous embodiment and is not described again here.
302. If the working mode is a single-camera working mode, the lower computer determines the current resolution of the upper computer;
In the single-camera mode, the lower computer can independently control the TOF camera to turn on, or independently control the RGB camera to turn on; specifically, which camera to turn on can be determined from the resolution specified by the upper computer.
303. The lower computer controls the TOF camera to turn on or the RGB camera to turn on according to the resolution;
For example, referring to the example given in step 101, the lower computer is connected to a 1280x962 TOF camera and a 1280x960 RGB camera. If the upper computer specifies a resolution of 1280x962, the lower computer determines to turn on the TOF camera; if the upper computer specifies a resolution of 1280x960, the lower computer determines to turn on the RGB camera.
304. If the lower computer controls the TOF camera to turn on, it acquires the phase map output by the TOF camera and outputs the phase map to the upper computer;
If the resolution specified by the upper computer is the resolution of the TOF camera, the lower computer controls the TOF camera to turn on, receives the YUV-format phase map output by the TOF camera, and outputs the phase map to the upper computer through the UVC protocol for secondary development, for example sending the phase map to the upper computer so that the upper computer can calibrate the RGBD camera.
It should be noted that in other specific embodiments the lower computer may also convert the phase map into a depth map and/or an infrared map and output the depth map and/or infrared map to the upper computer for direct use.
For example, if the upper computer specifies a resolution of 1280x962, the lower computer turns on the TOF camera and outputs a 1280x962 YUV-format phase map to the upper computer.
305. If the lower computer controls the RGB camera to turn on, it acquires the color map output by the RGB camera and outputs the color map to the upper computer.
If the resolution specified by the upper computer is the resolution of the RGB camera, the lower computer controls the RGB camera to turn on, receives the YUV-format color map output by the RGB camera, then MJPEG-encodes the YUV color map through a VGS module, i.e., an image processing module, and outputs the encoded MJPEG color map to the upper computer through the UVC protocol.
For example, if the upper computer specifies a resolution of 1280x960, the lower computer turns on the RGB camera and outputs a 1280x960 MJPEG-format color map to the upper computer.
In this embodiment, the lower computer chooses to start the TOF camera or the RGB camera independently according to the working mode (resolution) specified by the upper computer, realizing independent operation of either camera and covering more application scenarios.
Referring to FIG. 4, FIG. 4 shows an embodiment of the lower computer provided by the present application. The lower computer includes:
an obtaining unit 401, configured to obtain a current working mode of the upper computer;
the first control unit 402 is configured to control the TOF camera and the RGB camera to turn on when the working mode is the dual-camera working mode, and to obtain the depth map output by the TOF camera and the color map output by the RGB camera;
a synchronization unit 403 is configured to compare the timestamp of the depth map with the timestamp of the color map and determine a synchronized target depth map and target color map;
the first output unit 404 is configured to cache the target depth map and the target color map in the UVC data transmission buffer and output the data in the UVC data transmission buffer to the upper computer through the UVC protocol.
Optionally, the synchronization unit 403 includes:
a storing module 4031, configured to store the depth map and the corresponding timestamp in a first buffer queue, and store the color map and the corresponding timestamp in a second buffer queue;
a determining module 4032, configured to determine the color map at the head of the second buffer queue as the target color map;
and a matching module 4033, configured to match a synchronized target depth map in the first buffer queue according to the timestamp of the target color map.
Optionally, the matching module 4033 is specifically configured to:
traverse the first buffer queue from left to right, and calculate the timestamp difference between the timestamp of the target color map and the timestamp of each depth map in the first buffer queue;
and determine the depth map whose timestamp difference is smaller than the preset value as the target depth map.
Optionally, the synchronization unit 403 further includes:
a releasing module 4034, configured to release the data located before the target depth map in the first buffer queue.
Optionally, the first control unit 402 is specifically configured to:
acquire the phase map output by the TOF camera;
and convert the phase map into a depth map and an infrared map.
Optionally, the synchronization unit 403 is further configured to:
determine a target infrared map according to the target depth map, the target infrared map and the target depth map having the same timestamp;
the first output unit 404 is specifically configured to:
cache the target depth map, the target infrared map, and the target color map in the UVC data transmission buffer.
Optionally, the first control unit 402 is further configured to:
determine the current resolution of the upper computer;
and determine the data content and format of the depth map and the color map according to the resolution.
Optionally, the depth map is in YUV format, and the color map is in YUV format or MJPEG format.
Optionally, the lower computer further includes:
the second control unit 405 is configured to determine the current resolution of the upper computer when the working mode is the single-camera working mode, and to control the TOF camera or the RGB camera to turn on according to the resolution;
the second output unit 406 is configured to acquire the phase map output by the TOF camera and output the phase map to the upper computer when the second control unit controls the TOF camera to turn on;
and the third output unit 407 is configured to acquire the color map output by the RGB camera and output the color map to the upper computer when the second control unit controls the RGB camera to turn on.
In the lower computer of this embodiment, the functions of the units and modules correspond to the steps in the method embodiments shown in FIG. 1, FIG. 2, or FIG. 3, and are not described again here.
Referring to FIG. 5, FIG. 5 shows an embodiment of the apparatus for outputting image data provided by the present application. The apparatus includes:
a processor 501, a memory 502, an input/output unit 503, and a bus 504;
the processor 501 is connected with the memory 502, the input/output unit 503 and the bus 504;
the memory 502 stores a program, and the processor 501 calls the program to execute any one of the above methods of image data output.
The present application also relates to a computer-readable storage medium having a program stored thereon, wherein the program, when executed on a computer, causes the computer to perform any of the above methods of image data output.
It is clear to those skilled in the art that, for convenience and brevity of description, the specific working processes of the above-described systems, apparatuses and units may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In the several embodiments provided in the present application, it should be understood that the disclosed system, apparatus and method may be implemented in other manners. For example, the above-described apparatus embodiments are merely illustrative, and for example, a division of a unit is merely a logical division, and an actual implementation may have another division, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
Units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated unit, if implemented in the form of a software functional unit and sold or used as an independent product, may be stored in a computer-readable storage medium. Based on such understanding, the technical solution of the present application, in essence, or the part contributing to the prior art, or all or part of the technical solution, may be embodied in the form of a software product. The computer software product is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) to execute all or part of the steps of the methods of the embodiments of the present application. The aforementioned storage medium includes media that can store program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disk.

Claims (13)

1. A method of image data output, the method comprising:
the lower computer obtains the current working mode of the upper computer;
if the working mode is a dual-camera working mode, the lower computer controls a TOF camera and an RGB camera to turn on, and obtains a depth map output by the TOF camera and a color map output by the RGB camera;
the lower computer compares the timestamp of the depth map with the timestamp of the color map to determine a synchronized target depth map and target color map;
and the lower computer caches the target depth map and the target color map in a UVC data transmission buffer, and outputs the data in the UVC data transmission buffer to the upper computer through the UVC protocol.
2. The method according to claim 1, wherein the lower computer comparing the timestamp of the depth map with the timestamp of the color map to determine a synchronized target depth map and target color map comprises:
the lower computer stores the depth map and the corresponding timestamp into a first buffer queue, and stores the color map and the corresponding timestamp into a second buffer queue;
the lower computer determines the color map at the head of the second buffer queue as the target color map;
and the lower computer matches a synchronized target depth map in the first buffer queue according to the timestamp of the target color map.
3. The method according to claim 2, wherein the lower computer matching the synchronized target depth map in the first buffer queue according to the timestamp of the target color map comprises:
the lower computer traverses the first buffer queue from left to right, and calculates the timestamp difference between the timestamp of the target color map and the timestamp of each depth map in the first buffer queue;
and determines the depth map whose timestamp difference is smaller than a preset value as the target depth map.
4. The method according to claim 2, wherein after the lower computer matches the synchronized target depth map in the first buffer queue according to the timestamp of the target color map, the method further comprises:
the lower computer releases the data located before the target depth map in the first buffer queue.
5. The method according to claim 1, wherein the lower computer obtaining the depth map output by the TOF camera comprises:
the lower computer obtains a phase map output by the TOF camera;
and the lower computer converts the phase map into a depth map and an infrared map.
6. The method according to claim 5, wherein after the lower computer compares the timestamp of the depth map with the timestamp of the color map to determine the synchronized target depth map and target color map, the method further comprises:
the lower computer determines a target infrared map according to the target depth map, the target infrared map and the target depth map having the same timestamp;
and the lower computer caching the target depth map and the target color map in the UVC data transmission buffer comprises:
the lower computer caches the target depth map, the target infrared map, and the target color map in the UVC data transmission buffer.
7. The method according to claim 1, wherein after the lower computer obtains the current working mode of the upper computer, the method further comprises:
the lower computer determines the current resolution of the upper computer;
and the lower computer determines the data content and format of the depth map and the color map according to the resolution.
8. The method of claim 1, wherein the depth map is in YUV format and the color map is in YUV format or MJPEG format.
9. The method according to any one of claims 1 to 8, wherein after the lower computer acquires the current working mode of the upper computer, the method comprises:
if the working mode is a single-camera working mode, the lower computer determines the current resolution of the upper computer;
the lower computer controls the TOF camera to turn on or controls the RGB camera to turn on according to the resolution;
if the lower computer controls the TOF camera to turn on, acquiring a phase map output by the TOF camera and outputting the phase map to the upper computer;
and if the lower computer controls the RGB camera to turn on, acquiring the color map output by the RGB camera and outputting the color map to the upper computer.
10. The method according to claim 9, wherein the phase map is in YUV format and the color map is in MJPEG format.
11. A lower computer, comprising:
an acquisition unit, configured to acquire the current working mode of the upper computer;
a control unit, configured to control a TOF camera and an RGB camera to turn on when the working mode is a dual-camera working mode, and to acquire a depth map output by the TOF camera and a color map output by the RGB camera;
a synchronization unit, configured to compare the timestamp of the depth map with the timestamp of the color map and determine a synchronized target depth map and target color map;
and an output unit, configured to cache the target depth map and the target color map in a UVC data transmission buffer, and to output the data in the UVC data transmission buffer to the upper computer through the UVC protocol.
12. An apparatus for image data output, the apparatus comprising:
the device comprises a processor, a memory, an input and output unit and a bus;
the processor is connected with the memory, the input and output unit and the bus;
the memory stores a program that the processor calls to perform the method of any one of claims 1 to 10.
13. A computer-readable storage medium having a program stored thereon, which when executed on a computer performs the method of any one of claims 1 to 10.
CN202211335128.4A 2022-10-28 2022-10-28 Image data output method and device, lower computer and storage medium Pending CN115883815A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211335128.4A CN115883815A (en) 2022-10-28 2022-10-28 Image data output method and device, lower computer and storage medium

Publications (1)

Publication Number Publication Date
CN115883815A 2023-03-31

Family

ID=85759100

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211335128.4A Pending CN115883815A (en) 2022-10-28 2022-10-28 Image data output method and device, lower computer and storage medium

Country Status (1)

Country Link
CN (1) CN115883815A (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108989606A (en) * 2018-08-22 2018-12-11 Oppo广东移动通信有限公司 Image processing method and device, electronic equipment, computer readable storage medium
CN110312056A (en) * 2019-06-10 2019-10-08 青岛小鸟看看科技有限公司 A kind of synchronous exposure method and image capture device
CN110460824A (en) * 2019-07-03 2019-11-15 青岛小鸟看看科技有限公司 A kind of frame synchornization method and camera of image data
CN113052884A (en) * 2021-03-17 2021-06-29 Oppo广东移动通信有限公司 Information processing method, information processing apparatus, storage medium, and electronic device
CN113534596A (en) * 2021-07-13 2021-10-22 盛景智能科技(嘉兴)有限公司 RGBD stereo camera and imaging method
CN114945072A (en) * 2022-04-20 2022-08-26 优利德科技(中国)股份有限公司 Dual-camera frame synchronization processing method and device, user terminal and storage medium

Similar Documents

Publication Publication Date Title
US6704042B2 (en) Video processing apparatus, control method therefor, and storage medium
CN104270567B (en) High-precision synchronous multi-channel image acquisition system and time synchronization method thereof
CN110312056B (en) Synchronous exposure method and image acquisition equipment
US10951804B2 (en) Photographing synchronization method and apparatus
CN112153306B (en) Image acquisition system, method and device, electronic equipment and wearable equipment
CN105338323A (en) Video monitoring method and device
WO2018015806A1 (en) System and method providing object-oriented zoom in multimedia messaging
CN105657403B (en) The synchronization system of structured light projection and Image Acquisition based on FPGA
US20240104770A1 (en) Three-dimensional scanning system and three-dimensional scanning method
US9992446B2 (en) Image transmission system and image transmission method
CN114554250B (en) Video and position synchronization method of unmanned aerial vehicle or unmanned aerial vehicle
CN112565224A (en) Video processing method and device
CN105959562A (en) Method and device for obtaining panoramic photographing data and portable panoramic photographing equipment
CN115883815A (en) Image data output method and device, lower computer and storage medium
CN114845004A (en) Audio and video synchronization implementation method and acoustic imaging method
CN113225152B (en) Method and device for synchronizing cameras and computer readable medium
CN108737809B (en) Remote synchronous image acquisition method
CN115904281A (en) Cloud desktop conference sharing method, server and computer readable storage medium
CN109510998B (en) The method for obtaining unpressed IP Camera initial data
WO2022157105A1 (en) System for broadcasting volumetric videoconferences in 3d animated virtual environment with audio information, and method for operating said system
CN114283241A (en) Structured light three-dimensional reconstruction device and method
CN113497883A (en) Image processing method and system, camera module and image acquisition system
CN109151435B (en) Data processing method, terminal, server and computer storage medium
CN110634564A (en) Pathological information processing method, device and system, electronic equipment and storage medium
CN111800600A (en) Internet of things field video monitoring system based on low power consumption

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination