WO2023093054A1 - Data processing method, apparatus, and system, device, and storage medium - Google Patents


Info

Publication number
WO2023093054A1
WO2023093054A1 · PCT/CN2022/103025 · CN2022103025W
Authority
WO
WIPO (PCT)
Prior art keywords
image
time
row
data
image frame
Prior art date
Application number
PCT/CN2022/103025
Other languages
French (fr)
Chinese (zh)
Inventor
赖海斌
李清正
陈胜杰
石建萍
Original Assignee
上海商汤智能科技有限公司
Application filed by 上海商汤智能科技有限公司 filed Critical 上海商汤智能科技有限公司
Publication of WO2023093054A1 publication Critical patent/WO2023093054A1/en


Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00: Details of television systems
    • H04N5/04: Synchronising
    • H04N5/06: Generation of synchronising signals
    • H04N5/067: Arrangements or circuits at the transmitter end
    • H04N5/0675: Arrangements or circuits at the transmitter end for mixing the synchronising signals with the picture signal or mutually

Definitions

  • the present disclosure relates to the field of computer technology, and in particular, to a data processing method, device, system, equipment, and storage medium.
  • Autonomous driving is an important research direction in artificial intelligence technology.
  • the autonomous driving system can detect environmental information through the fusion of data collected by multiple sensors for target detection and tracking.
  • each frame of sensor data is required to carry time stamp information.
  • the current system time can be added to the time tag of the corresponding data frame.
  • Embodiments of the present disclosure at least provide a data processing method, device, system, device, and storage medium.
  • an embodiment of the present disclosure provides a data processing method, the method comprising: in response to receiving a synchronization trigger signal from a positioning module, acquiring the image acquisition moment indicated by the synchronization trigger signal, and the image exposure duration corresponding to the current image frame collected by one or more image sensors from the image acquisition moment; determining the timestamp information of the current image frame based on the image acquisition moment and the image exposure duration corresponding to the current image frame; and synchronously storing the current image frame and the timestamp information of the current image frame.
  • an embodiment of the present disclosure also provides a data processing apparatus, including: an acquisition module, configured to acquire, in response to receiving the synchronization trigger signal of the positioning module, the image acquisition moment indicated by the synchronization trigger signal and the image exposure duration corresponding to the current image frame collected by the image sensor from the image acquisition moment; a determination module, configured to determine the timestamp information of the current image frame based on the image acquisition moment and the image exposure duration corresponding to the current image frame; and a storage module, configured to synchronously store the current image frame and the timestamp information of the current image frame.
  • an embodiment of the present disclosure also provides a data processing system, including a field programmable gate array (FPGA) computing unit and an ARM processor. The ARM processor is configured to read into memory the image acquisition moment indicated by the synchronization trigger signal generated by the global positioning system (GPS) positioning unit and the image exposure duration corresponding to the current image frame collected by the image sensor from the image acquisition moment, and to input the image acquisition moment and the image exposure duration corresponding to the current image frame to the FPGA computing unit. The FPGA computing unit is configured to determine the timestamp information of the current image frame based on the image acquisition moment and the image exposure duration corresponding to the current image frame, and to synchronously store the current image frame and the timestamp information of the current image frame into the memory.
  • an embodiment of the present disclosure further provides an electronic device, including a processor, a memory, and a bus. The memory stores machine-readable instructions executable by the processor. When the electronic device runs, the processor communicates with the memory through the bus, and when the machine-readable instructions are executed by the processor, the steps of the data processing method described in any one of the first aspect and its implementations are performed.
  • an embodiment of the present disclosure also provides a computer-readable storage medium on which a computer program is stored; when the computer program is executed by a processor, the steps of the data processing method described in any one of the first aspect and its implementations are performed.
  • FIG. 1 shows a flowchart of a data processing method provided by an embodiment of the present disclosure
  • FIG. 2 shows a flowchart of a specific method for determining timestamp information in a data processing method provided by an embodiment of the present disclosure
  • FIG. 3a shows a flow chart of a specific method for synchronous acquisition in a data processing method provided by an embodiment of the present disclosure
  • FIG. 3b shows a flow chart of a specific method for synchronous acquisition in a data processing method provided by an embodiment of the present disclosure
  • FIG. 4 shows an application schematic diagram of a data processing method provided by an embodiment of the present disclosure
  • FIG. 5 shows a schematic diagram of a data processing system provided by an embodiment of the present disclosure
  • FIG. 6 shows a schematic diagram of a data processing device provided by an embodiment of the present disclosure
  • Fig. 7 shows a schematic diagram of an electronic device provided by an embodiment of the present disclosure.
  • a and/or B may mean that A exists alone, A and B exist simultaneously, and B exists alone.
  • "at least one" herein means any one of multiple items, or any combination of at least two of them; for example, including at least one of A, B, and C may mean including any one or more elements selected from the set formed by A, B, and C.
  • the present disclosure provides a data processing method, device, system, equipment and storage medium to determine a more accurate data time stamp.
  • the execution subject of the data processing method provided in the embodiment of the present disclosure is generally an electronic device with a certain computing capability.
  • the electronic device includes, for example, a terminal device, a server, or other processing equipment. The terminal device may be user equipment (UE), a mobile device, a personal digital assistant (PDA), a handheld device, a computing device, a vehicle-mounted device, a wearable device, etc., and the other processing equipment may be a custom development board based on a semi-custom circuit such as a field programmable gate array (FPGA). Given the favorable characteristics of FPGA boards, an FPGA board is used as an example in the following description.
  • the above data processing method may be implemented by a processor invoking computer-readable instructions stored in a memory.
  • the data processing method includes the following steps S101-S103.
  • S101 In response to receiving the synchronous trigger signal of the positioning module, acquire the image acquisition time indicated by the synchronous trigger signal, and the image exposure time corresponding to the current image frame collected by the image sensor since the image acquisition time;
  • S102 Determine the timestamp information of the current image frame based on the image acquisition time and the image exposure time corresponding to the current image frame;
  • S103 Synchronously store the current image frame and the timestamp information of the current image frame, and the timestamp information of the current image frame is used as a reference time when fusing the current image frame with other collected data.
  • the data processing method in the embodiments of the present disclosure may be applied to the determination of the time stamp of the image collected by the image sensor.
  • As the image sensor is an indispensable sensor in the field of autonomous driving, accurate timestamp information for its collected images provides strong data support for subsequent data fusion.
  • the time stamp information can be determined by software, that is, when the system receives sensor data, the current system time is added to the time tag of the corresponding data frame as the time stamp of the corresponding data frame. Time delay, there is a big error in this way.
  • the related art also provides a hardware method to determine the timestamp information; that is, a hardware trigger signal is used to control each sensor to collect data at the same time.
  • this method controls synchronous collection between sensors well; however, for image sensors such as cameras, most cameras integrate an automatic exposure (AE) function, so the actual exposure duration, and hence the actual exposure moment, varies from frame to frame, which the trigger time alone does not capture.
  • an embodiment of the present disclosure provides a data processing method for determining time stamp information based on image exposure time, so that the determined time stamp information is more accurate.
  • the positioning module in the embodiments of the present disclosure may be a positioning receiver installed on the vehicle, such as a Global Positioning System (GPS) receiver, a Global Navigation Satellite System (GNSS) receiver, or another positioning receiver, which is not specifically limited here.
  • the image sensor in the embodiments of the present disclosure can be a camera installed on the vehicle to capture images of the vehicle's surroundings.
  • the installation positions of the cameras can be configured according to different scene requirements; for example, two cameras may be installed at each location, or, as another example, cameras may be installed only at the front of the vehicle; the specific configuration is not limited here.
  • the positioning module in the embodiment of the present disclosure can receive, through an antenna, high-precision time information based on the atomic clocks carried by satellites, and output data packets in the National Marine Electronics Association (NMEA) format together with a pulse-per-second (PPS) whole-second clock signal to the FPGA board.
  • in response to receiving the synchronization trigger signal of the positioning module, the data processing method provided by the embodiment of the present disclosure can obtain the image acquisition moment indicated by the synchronization trigger signal and the image exposure duration corresponding to the current image frame collected by the image sensor from the image acquisition moment.
  • the above image acquisition moment may indicate the trigger moment corresponding to the rising edge of the image acquisition trigger signal for the image sensor, generated under the synchronization trigger signal; at this trigger moment, the image sensor starts the image acquisition operation.
  • the same trigger time can be preset for different image sensors to realize synchronous triggering of multiple image sensors, and different trigger times can also be preset to meet the acquisition requirements of other sensors.
  • internally, the image sensor has multiple rows of photosensitive elements and may employ, for example, a global shutter or a rolling shutter.
  • the image exposure durations corresponding to different rows of image data (pixels) may be the same; therefore, the image exposure duration here may be the exposure duration of a single row of image data.
  • the timestamp information of the current image frame can be determined based on the image acquisition moment and the image exposure duration corresponding to the current image frame; that is, when the image exposure duration differs, the determined timestamp information also differs. In this way, the impact of exposure-duration differences on timestamp accuracy can be eliminated.
  • the current image frame carrying the timestamp information can then be fused with other collected data.
  • the other collected data here may be any data that is fused with the current image frame.
  • the above data fusion solution may be implemented based on an Industrial Personal Computer (IPC).
  • the stored current image frame and its timestamp information are acquired; based on the timestamp information of the current image frame, the current image frame is fused with the other collected data to obtain fused data.
  • the image acquisition moment indicated by the synchronization trigger signal and the image exposure duration corresponding to the acquisition of the current image frame can be obtained, and then the timestamp information of the current image frame can be determined based on the image acquisition moment and the corresponding image exposure duration.
  • the fused data can also more accurately characterize the surrounding environment of autonomous driving, thereby guiding safer autonomous driving operations.
  • the timestamp determination process may include the following steps:
  • Step 1 Based on the image acquisition time and the preset time difference interval, determine the line exposure start and end time of the first line of image data in the current image frame;
  • Step 2 Determine the time stamp information of the current image frame based on the row exposure start and end times of the first row of image data in the current image frame and the image exposure duration corresponding to the first row of image data.
  • the timestamp information here can indicate the exposure center moment of the current image frame, mainly because the image center pixel corresponding to the exposure center moment is more likely to correspond to the object of interest in the image. In this way, when the exposure center moment is used as the timestamp information of the image frame, the accuracy of the timestamp can be significantly improved.
  • the internal synchronization mechanism of the image sensor can be left-aligned; that is, the time between the image acquisition trigger signal and the image exposure start moment is fixed (there is a preset time difference interval between them).
  • the internal synchronization mechanism of the image sensor can also be right-aligned; that is, the time between the image acquisition trigger signal and the image exposure termination moment is fixed (there is a preset time difference interval between them).
  • the row exposure start and end moment therefore refers to either the image exposure start moment or the image exposure termination moment. For ease of illustration, the following examples assume right alignment.
  • some image sensors expose all rows of image data simultaneously (e.g., a global shutter), while others use an exposure method in which the exposure start times of adjacent rows of image data differ by one time unit (e.g., a rolling shutter).
  • the row exposure center moment of the middle row of image data can be determined according to the exposure mode of the corresponding image sensor, and this moment can be used as the timestamp information of the image frame.
  • the schemes for determining the timestamp information of the current image frame are described below for each of the above two exposure modes.
  • for simultaneous exposure of all rows (global shutter), the timestamp information of the current image frame can be determined according to the following steps:
  • Step 1 Obtain the row exposure center moment of the first row of image data based on the row exposure start and end times of the first row of image data in the current image frame and the corresponding image exposure duration of the first row of image data;
  • Step 2 Determine the row exposure center moment of the first row of image data as the time stamp information of the current image frame.
  • the image exposure duration from the start of exposure to the center pixel of the first row of image data, or from the center pixel to the end of exposure, is half of the full image exposure duration; the row exposure center moment of the first row is then obtained by taking the difference between the row exposure start and end moment of the first row and half of the image exposure duration.
  • when all rows are exposed simultaneously, the row exposure center moment of the middle row of image data is the same as that of the first row, so the row exposure center moment of the first row of image data can be determined as the timestamp information of the current image frame.
  • for the exposure method in which adjacent rows differ by one time unit (rolling shutter), the timestamp information of the current image frame can be determined according to the following steps:
  • Step 1 Obtain the row exposure center moment of the first row of image data based on the row exposure start and end times of the first row of image data in the current image frame and the corresponding image exposure duration of the first row of image data;
  • Step 2 Determine the row exposure center moment of the middle row image data based on the row exposure center moment of the first row image data and the number of time units between the middle row image data and the first row image data in the current image frame;
  • Step 3 Determine the row exposure center moment of the middle row image data as the time stamp information of the current image frame.
  • the row exposure center moment of the first row of image data may be determined first. Since the exposure start times of adjacent rows of image data in the current image frame differ by one time unit, the row exposure center moments of the middle row and the first row differ by as many time units as there are rows between them. Therefore, the row exposure center moment of the middle row of image data can be determined from that of the first row and can be determined as the timestamp information of the current image frame.
  • the frame synchronization signal (FSYNC) can be provided by the synchronization trigger signal generation module in the FPGA board, with a frequency based on the actual camera frame rate, which can be set to 30 Hz here; it can be generated by a phase-locked loop (PLL).
  • the vertical synchronization signal (VSYNC) inside the image sensor has the same frequency and phase as the externally input FSYNC.
  • the time interval between the rising edge of the image acquisition trigger signal and the readout moment of each row of image data (also the row exposure termination moment) is fixed.
  • t0 is the image acquisition moment indicated by the FSYNC synchronization trigger signal of the Nth image frame; t1 is the row exposure termination moment of the first row of image data, whose time difference from t0 is a constant T(readout); t2 is the row exposure center moment of the first row of image data; and the row exposure duration of the current image frame is T(exposure).
  • the exposure time of adjacent rows differs by a time unit of 1H.
  • the resolution of the image is Wide x High, where Wide represents the number of pixels in the horizontal direction and High represents the number of pixels in the vertical direction; the timestamp information t of the corresponding current image frame is then:
  • t = t0 + T(readout) - T(exposure)/2 + (High/2) x H
  • the values of T(readout) and H can be determined according to the specific sensor, and the value of T(exposure) can be obtained from the Mobile Industry Processor Interface (MIPI) data stream.
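As an illustrative sketch (not the patent's FPGA implementation), the timestamp computation above can be written out in Python for both exposure modes; the function name and the sample values are hypothetical, and all times are in microseconds:

```python
def frame_timestamp(t0_us, t_readout_us, t_exposure_us,
                    rolling_shutter=False, high=0, h_us=0.0):
    """Exposure-center timestamp of an image frame, assuming a
    right-aligned sensor: the first row finishes exposing a fixed
    T(readout) after the trigger moment t0.

    t0_us           -- image acquisition moment indicated by FSYNC (us)
    t_readout_us    -- fixed delay T(readout) from trigger to first-row readout (us)
    t_exposure_us   -- row exposure duration T(exposure) (us)
    rolling_shutter -- True if adjacent rows start exposing 1H apart
    high            -- number of pixel rows (image height High)
    h_us            -- duration of one row time unit H (us)
    """
    # Exposure center of the first row: its readout (termination) moment
    # minus half the exposure duration.
    first_row_center = t0_us + t_readout_us - t_exposure_us / 2.0
    if not rolling_shutter:
        # Global shutter: every row shares the same exposure center.
        return first_row_center
    # Rolling shutter: the middle row starts High/2 time units later.
    return first_row_center + (high / 2.0) * h_us

# Hypothetical values: trigger at t0 = 1_000_000 us, 50 us readout delay,
# 10 ms exposure, 1080 rows, 15 us per row time unit.
ts = frame_timestamp(1_000_000, 50, 10_000,
                     rolling_shutter=True, high=1080, h_us=15.0)
```

The same function with `rolling_shutter=False` reduces to the global-shutter case, where the first-row center is used directly.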
  • the timestamp information can thus be determined according to the above implementations for different exposure methods, so that precise timestamp information is assigned to each image frame.
  • determining the timestamp information of the current image frame based on the row exposure start and end moment of the first row of image data and the row exposure duration corresponding to the first row includes: determining the exposure mode of the image sensor; in response to determining that the exposure mode is simultaneous exposure of all rows, obtaining the row exposure center moment of the first row based on its row exposure start and end moment and corresponding image exposure duration, and determining that moment as the timestamp information of the current image frame; and in response to determining that the exposure mode is one in which the row exposure start times of adjacent rows differ by one time unit, obtaining the row exposure center moment of the first row based on its row exposure start and end moment and corresponding row exposure duration, determining the row exposure center moment of the middle row based on the row exposure center moment of the first row and the number of time units between the middle row and the first row, and determining the row exposure center moment of the middle row as the timestamp information of the current image frame.
  • multiple image sensors can be combined to achieve image data fusion.
  • the multiple image sensors here can acquire image frames synchronously, specifically, the image acquisition moment indicated by the synchronous trigger signal can be determined in the following manner:
  • Step 1 In response to receiving the synchronous trigger signal of the positioning module, for each image sensor in the plurality of image sensors, generate an image acquisition trigger signal for triggering the image sensor to acquire the current image frame;
  • Step 2 Determine the trigger moment corresponding to the rising edge of the image acquisition trigger signal as the image acquisition moment of the image sensor.
  • an image acquisition trigger signal for triggering image frame acquisition of the image sensor may be generated based on the synchronous trigger signal. For each image sensor, an image acquisition trigger signal corresponding to the image sensor is generated.
  • each image acquisition trigger signal has the same frequency and phase; that is, different image sensors perform image acquisition synchronously, which suits application scenarios in which visual perception is primary and multiple image sensors need to be exposed at the same time.
  • the multiple image sensors in the embodiments of the present disclosure may collect image frames with a preset time interval, specifically, the image collection moment indicated by the synchronous trigger signal may be determined in the following manner:
  • Step 1 In response to receiving the synchronization trigger signal of the positioning module, perform frequency conversion and/or phase-shift processing on the synchronization trigger signal based on the preset time interval between the multiple image sensors, to obtain the image acquisition trigger signals used to trigger the multiple image sensors to each acquire the current image frame; the time interval between the rising edges of the image acquisition trigger signals of the multiple image sensors satisfies the preset time interval;
  • Step 2 For each image sensor in the plurality of image sensors, determine the trigger moment corresponding to the rising edge of the image acquisition trigger signal of the image sensor as the image acquisition moment of the image sensor.
  • the image acquisition trigger signals can be generated by frequency conversion or phase-shift processing of the synchronization trigger signal; that is, different image sensors can perform image acquisition asynchronously, which suits application scenarios based on lidar perception.
  • the image sensor needs to adjust the initial phase of the trigger signal according to the specific installation position to ensure that the image frame is aligned with the point cloud data.
  • the synchronization signal generation module (Trigger Module) in the FPGA board generates trigger signals at the frequencies required by the cameras and the IMU through phase locking and frequency multiplication, with a phase error of no more than 10 ns.
  • FSYNC0 ⁇ FSYNC7 are used to control the synchronous acquisition of 8 cameras
  • FSYNC8 is used to trigger the synchronous acquisition of IMU, and the frequency/phase is variable.
  • the 8 image sensors have the same frequency and phase, which can be obtained by frequency conversion of the GNSS PPS signal; this enables the 8 image sensors to be exposed at the same time.
  • the trigger signals for the associated lidar and IMU can likewise be obtained by frequency conversion of the PPS signal. This is only a specific example; the specific frequency conversion coefficients may be determined in combination with different application requirements and are not detailed here.
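The relationship between one PPS whole-second pulse and the derived trigger edges can be modeled with a short Python sketch; this is only an illustration of the frequency conversion, not the FPGA logic itself, and the function name and sample values are hypothetical:

```python
def trigger_edges_from_pps(pps_second, freq_hz=30, phase_offset_s=0.0):
    """Rising-edge times (in seconds) of one camera trigger signal
    derived from a single GNSS PPS pulse, mimicking phase-locked
    frequency multiplication in the FPGA. Giving all sensors the same
    freq_hz and phase_offset_s yields same-frequency, same-phase
    triggers, i.e. simultaneous exposure."""
    period = 1.0 / freq_hz
    return [pps_second + phase_offset_s + k * period for k in range(freq_hz)]

# Two of eight cameras at 30 Hz: Camera0 with zero phase offset,
# Camera3 deliberately shifted by 10 ms (hypothetical values).
edges_cam0 = trigger_edges_from_pps(100, 30, 0.0)
edges_cam3 = trigger_edges_from_pps(100, 30, 0.010)
```

With a nonzero `phase_offset_s`, the edge trains are staggered, which corresponds to the exposure time difference between camera groups described below.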
  • alternatively, Camera0-Camera2 of the 8 image sensors work at the same frequency and phase, while Camera3-Camera7 can have different frequencies and/or phases, which gives the image sensors a certain exposure time difference; this exposure time difference directly determines the image acquisition order of the image sensors.
  • when the lidar is of a rotating (mechanically scanning) type, the above 8 cameras can achieve image acquisition at a preset time interval.
  • the preset time interval can be determined through the following steps:
  • Step 1 Obtain the scanning range of the radar device; the radar device and the image sensor collect for the same scene;
  • Step 2 Divide the scanning range into a plurality of scanning sub-ranges; for each image sensor among the plurality of image sensors, the image sensor is responsible for collecting images in one of the scanning sub-ranges;
  • Step 3 Based on the scanning time of the radar device for each scanning sub-range in the multiple scanning sub-ranges, determine a preset time interval between two adjacent image sensors among the multiple image sensors.
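The interval computation in the steps above can be sketched as follows; this assumes a rotating lidar with a constant scan period divided into equal sub-ranges, and the function name and sample values are hypothetical:

```python
def camera_offsets_for_lidar(scan_period_s, num_cameras):
    """Divide one full lidar rotation into equal angular sub-ranges and
    return the trigger-time offset of each camera, so that each camera
    is exposed while the lidar sweeps the sub-range that camera covers.

    scan_period_s -- time for one full lidar rotation (s)
    num_cameras   -- number of image sensors, one per sub-range
    """
    # Scanning time per sub-range = the preset time interval between
    # two adjacent image sensors.
    sub_range_time = scan_period_s / num_cameras
    return [i * sub_range_time for i in range(num_cameras)]

# A 10 Hz rotating lidar (0.1 s per rotation) with 8 cameras gives a
# 12.5 ms preset interval between adjacent camera triggers.
offsets = camera_offsets_for_lidar(0.1, 8)
```

Each offset would then be applied as the phase of the corresponding camera's image acquisition trigger signal.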
  • the radar device and the image sensor here can collect for the same scene.
  • the radar equipment can be installed on the self-driving vehicle together with the image sensors. In this way, based on the correspondence between each scanning sub-range of the radar device and each image sensor, the preset time interval between two adjacent image sensors can be determined, and thus the installation position of each image sensor can be determined, which facilitates scene deployment.
  • the preset time interval between the image sensors is determined in combination with the scanning range of the radar device; that is, the acquisition operations of the multiple image sensors can be triggered in step with the scanning sub-ranges of the radar device, thereby realizing synchronous acquisition between the radar device and the multiple image sensors with better adaptability.
  • in response to receiving the synchronization trigger signal of the positioning module, the attitude capture moment of the target attitude data captured by the inertial measurement unit can be obtained, the attitude capture moment can be determined as the timestamp information of the target attitude data, and the target attitude data and its timestamp information can be stored synchronously.
  • data fusion can be realized based on the proximity between the timestamp information of the target pose data and the timestamp information of the image frame, and the fused data is more in line with current application scenarios.
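A minimal sketch of such proximity-based matching, assuming the pose timestamps are sorted in ascending order; the names and sample rates are hypothetical:

```python
import bisect

def match_nearest(image_ts, pose_ts_sorted):
    """For each image timestamp, pick the pose sample whose timestamp
    is closest -- the proximity criterion used when fusing image frames
    with IMU pose data. pose_ts_sorted must be ascending."""
    matched = []
    for t in image_ts:
        i = bisect.bisect_left(pose_ts_sorted, t)
        # The nearest pose is either the one just before or just after t.
        candidates = pose_ts_sorted[max(0, i - 1):i + 1]
        matched.append(min(candidates, key=lambda p: abs(p - t)))
    return matched

poses = [0.00, 0.01, 0.02, 0.03]   # 100 Hz IMU pose timestamps (s)
images = [0.0031, 0.0195]          # exposure-center image timestamps (s)
pairs = match_nearest(images, poses)
```

Because the image timestamp is the exposure center rather than the trigger moment, this matching pairs each frame with the pose that actually prevailed while the frame was being exposed.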
  • in response to receiving the synchronization trigger signal of the positioning module, the point cloud scanning moment of the point cloud data scanned by the radar device can also be obtained, the point cloud scanning moment can be determined as the timestamp information of the point cloud data, and the point cloud data and its timestamp information can be stored synchronously.
  • the determined point cloud scanning time may be determined based on different timing schemes.
  • the trigger moment corresponding to the synchronization trigger signal can be determined as the point cloud scanning moment; alternatively, the radar device can synchronize in real time, for example via the Precision Time Protocol (PTP), the first timing unit included in the radar device with the second timing unit included in the Ethernet controller, obtain the timing time of the second timing unit, and use it as the point cloud scanning moment.
  • the point cloud scanning moment here can be the scanning moment corresponding to a certain phase during the point cloud scanning process.
  • the timing time of the second timing unit is the starting timing time recorded by the second timing unit when the trigger moment corresponding to the synchronization trigger signal is synchronized to it.
  • the timing time of the second timing unit can be obtained based on synchronization of the system time. A specific example is described in conjunction with FIG. 4.
  • after the FPGA receives the NMEA-format data packets and the PPS signal output by the GNSS positioning module, they are split into two physical links: one link is transmitted directly to Lidar0 and serves as the GPS synchronization input of Lidar0 (corresponding to the above first synchronization method); the other link is given to the ARM module in the SoC (System on Chip), which receives the GPS time information through the Chrony tool and synchronizes it to the system time.
  • the embodiments of the present disclosure can also build an NTP (Network Time Protocol) server through the Chrony tool to serve time to the IPC (Industrial Personal Computer), and the IPC can then perform data fusion according to different fusion requirements.
  • the trigger module can use the rising edge of the TrigX signal with the same frequency and phase as FSYNC to trigger the second timing unit in the GEM, and read the timing time in the second timing unit through the internal high-speed AXI (Advanced eXtensible Interface) bus.
  • the time synchronization of the radar equipment mentioned above can be implemented based on ARM, and in practical applications, it can also be implemented on an FPGA board, and no specific limitation is set here.
  • The embodiment of the present disclosure also provides a data processing system, as shown in FIG. 5, which may include: an FPGA computing unit 501 and an ARM processor 502.
  • The ARM processor 502 is used to read into the memory the image acquisition time indicated by the synchronous trigger signal generated by the global positioning system (GPS) positioning unit, together with the image exposure duration corresponding to the current image frame collected by the image sensor from that image acquisition time onward, and to input the image acquisition time and the image exposure duration corresponding to the current image frame into the FPGA operation unit 501;
  • The FPGA operation unit 501 is used to determine the time stamp information of the current image frame based on the image acquisition time and the image exposure duration corresponding to the current image frame, and to store the current image frame and its time stamp information into the memory synchronously; the time stamp information of the current image frame is used as a reference time when fusing the current image frame with other acquired data.
  • The writing order of the steps does not imply a strict execution order or constitute any limitation on the implementation process; the specific execution order of each step should be determined by its function and possible internal logic.
  • The embodiment of the present disclosure also provides a data processing device corresponding to the data processing method. Since the problem-solving principle of the device in the embodiment of the present disclosure is similar to that of the above data processing method, the implementation of the device may refer to the implementation of the method, and repeated descriptions are omitted.
  • the device includes: an acquisition module 601, a determination module 602, and a storage module 603; wherein,
  • The acquisition module 601 is configured to, in response to receiving the synchronous trigger signal of the positioning module, acquire the image acquisition moment indicated by the synchronous trigger signal, and the image exposure duration corresponding to the current image frame acquired, from that image acquisition moment onward, by one of the one or more image sensors;
  • a determining module 602 configured to determine the time stamp information of the current image frame based on the image acquisition time and the image exposure time corresponding to the current image frame;
  • the storage module 603 is configured to store the current image frame and the time stamp information of the current image frame synchronously, and the time stamp information of the current image frame is used as a reference time when fusing the current image frame with other collected data.
  • With the above data processing device, in response to the synchronous trigger signal of the positioning module, the image acquisition time indicated by the synchronous trigger signal and the image exposure duration corresponding to the acquisition of the current image frame can be obtained, and the timestamp information of the current image frame can then be determined based on the image acquisition time and the corresponding image exposure duration.
  • This disclosure realizes the determination of time stamp information in combination with image exposure attributes.
  • The time stamps determined for image sensors with different exposure attributes are also different, which makes the determined time stamp information better fit the exposure time of the captured target and better meet the needs of the actual scene.
  • the determination module 602 is configured to determine the time stamp information of the current image frame based on the image acquisition time and the image exposure time corresponding to the current image frame according to the following steps:
  • the time stamp information of the current image frame is determined based on the row exposure start and end times of the first row of image data in the current image frame and the image exposure duration corresponding to the first row of image data.
  • In an optional implementation, the determination module 602 is configured to determine the timestamp information of the current image frame, based on the row exposure start and end times of the first row of image data in the current image frame and the image exposure duration corresponding to the first row of image data, according to the following steps:
  • the row exposure center moment of the first row of image data is obtained;
  • the row exposure center moment of the first row of image data is determined as the time stamp information of the current image frame.
  • In an optional implementation, the determination module 602 is configured to determine the timestamp information of the current image frame, based on the row exposure start and end times corresponding to the first row of image data in the current image frame and the image exposure duration corresponding to the first row of image data, according to the following steps:
  • the row exposure center moment of the first row of image data is obtained;
  • the row exposure center moment of the intermediate row of image data is determined based on it and is taken as the time stamp information of the current image frame.
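The two variants above (first-row center vs. middle-row center) reduce to simple arithmetic on the row exposure window. The sketch below is illustrative only: the function names, the per-row readout offset parameter, and the unit convention (seconds) are assumptions, not taken from the disclosure.

```python
def row_exposure_center(row_start, exposure):
    # Center moment of one row's exposure window [row_start, row_start + exposure].
    return row_start + exposure / 2.0

def frame_timestamp(acq_time, exposure, use_middle_row=False,
                    num_rows=0, row_offset=0.0):
    # First variant: timestamp = exposure center of the first row.
    first_center = row_exposure_center(acq_time, exposure)
    if not use_middle_row:
        return first_center
    # Second variant: shift to the middle row, assuming each row starts
    # exposing row_offset seconds after the previous one (rolling shutter).
    return first_center + (num_rows // 2) * row_offset
```

For a global-shutter sensor both variants coincide, since every row shares the same exposure window (row_offset is effectively zero).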
  • In an optional implementation, the one or more image sensors include multiple image sensors, and image acquisition is performed synchronously among the multiple image sensors; the acquisition module 601 is configured, in response to receiving the synchronous trigger signal of the positioning module, to obtain the image acquisition moment indicated by the synchronous trigger signal according to the following steps:
  • the trigger moment corresponding to the rising edge of the image acquisition trigger signal is determined as the image acquisition moment of the image sensor.
  • In an optional implementation, the one or more image sensors include multiple image sensors, and image acquisition is performed with a preset time interval between the multiple image sensors; the acquisition module 601 is configured, in response to receiving the synchronous trigger signal of the positioning module, to obtain the image acquisition moment indicated by the synchronous trigger signal according to the following steps:
  • In response to receiving the synchronous trigger signal of the positioning module, frequency conversion and/or phase shifting is applied to the synchronous trigger signal, based on the preset time interval between the multiple image sensors, to obtain the image acquisition trigger signals used to respectively trigger the multiple image sensors to acquire the current image frame;
  • the trigger moment corresponding to the rising edge of the image acquisition trigger signal of the image sensor is determined as the image acquisition moment of the image sensor.
  • the acquisition module 601 is configured to determine a preset time interval between multiple image sensors according to the following steps:
  • each of the multiple image sensors is responsible for collecting images in one scan sub-range of the multiple scan sub-ranges;
  • a preset time interval between two adjacent image sensors among the plurality of image sensors is determined.
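A minimal sketch of the interval determination above, under the assumption that the scan range is divided evenly into one scan sub-range per image sensor over one scan period, so adjacent sensors are triggered one sub-range apart in time. Names, units (seconds), and the even-division assumption are illustrative, not taken from the disclosure.

```python
def sensor_trigger_plan(scan_period, num_sensors):
    # One scan sub-range per sensor: adjacent sensors are separated by
    # scan_period / num_sensors, and sensor i is offset by i intervals.
    interval = scan_period / num_sensors
    offsets = [i * interval for i in range(num_sensors)]
    return interval, offsets
```

For example, a 10 Hz scan (0.1 s period) covered by four cameras would give a 25 ms preset interval between adjacent cameras.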
  • the acquisition module 601 is further configured to acquire the attitude capture moment of the target attitude data captured by the inertial measurement device in response to receiving the synchronous trigger signal of the positioning module;
  • The storage module 603 is also used to determine the attitude capture moment of the target attitude data as the timestamp information of the target attitude data, and to store the target attitude data and its timestamp information synchronously; the timestamp information of the target attitude data is used as the reference time when fusing the target attitude data with the current image frame.
  • the acquiring module 601 is further configured to acquire the point cloud scanning time of the point cloud data scanned by the radar device in response to receiving the synchronous trigger signal of the positioning module;
  • The storage module 603 is also used to determine the point cloud scanning time as the time stamp information of the point cloud data, and to store the point cloud data and its time stamp information synchronously; the time stamp information of the point cloud data is used as the reference time when fusing the point cloud data with the current image frame.
  • the acquisition module 601 is configured to acquire the point cloud scanning time of the point cloud data scanned by the radar device according to the following steps:
  • The timing time of the second timing unit is obtained, and the timing time of the second timing unit is determined as the point cloud scanning time; the timing time of the second timing unit is counted from the start timing time recorded by the second timing unit when the start triggering moment corresponding to the synchronization trigger signal is synchronized to it.
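The latching behavior described above can be modeled as follows. This is a toy software model of the second timing unit (in practice it would be a hardware counter read over a bus); all names and the example numbers are invented for illustration.

```python
class SecondTimingUnit:
    # At the trigger moment of the synchronization signal, the trigger time
    # is latched as the start timing time; the unit then accumulates
    # elapsed time on top of that base.
    def __init__(self):
        self.base = 0.0
        self.elapsed = 0.0

    def sync(self, trigger_time):
        self.base = trigger_time
        self.elapsed = 0.0

    def tick(self, dt):
        self.elapsed += dt

    def read(self):
        return self.base + self.elapsed

unit = SecondTimingUnit()
unit.sync(1_700_000_000.0)   # trigger moment of the synchronization signal
unit.tick(0.05)              # 50 ms of scanning elapses
point_cloud_scan_time = unit.read()
```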
  • the above-mentioned device also includes:
  • The fusion module 604 is configured, after the current image frame and its time stamp information have been stored synchronously, and in response to receiving a data fusion instruction, to obtain the current image frame and the time stamp information of the current image frame stored by the above data processing method, and to fuse the current image frame with the collected data based on the time stamp information of the current image frame to obtain fused data.
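One common way to realize "fusing based on the timestamp as reference time" is nearest-timestamp matching between the image frame and the other sensor data. This is an illustrative choice, not necessarily the fusion strategy of the disclosure; names are invented.

```python
def fuse_by_timestamp(frame_timestamp, candidates):
    # Pick the candidate datum (e.g. a point cloud or pose sample) whose
    # timestamp is closest to the frame's reference time.
    return min(candidates, key=lambda c: abs(c["timestamp"] - frame_timestamp))
```

With frame timestamps placed at the exposure center rather than the bare trigger time, this matching pairs each frame with the sensor sample captured closest to when the scene was actually exposed.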
  • FIG. 7 is a schematic structural diagram of the electronic device provided by the embodiment of the present disclosure, including: a processor 701, a memory 702, and a bus 703.
  • The memory 702 stores machine-readable instructions executable by the processor 701 (for example, execution instructions corresponding to the acquisition module 601, the determination module 602, and the storage module 603 in the device in FIG. 6). When the electronic device is running, the processor 701 communicates with the memory 702 through the bus 703, and the machine-readable instructions, when executed by the processor 701, perform the following steps:
  • the current image frame and the timestamp information of the current image frame are stored synchronously, and the timestamp information of the current image frame is used as a reference time when fusing the current image frame with other collected data.
  • Embodiments of the present disclosure also provide a computer-readable storage medium, on which a computer program is stored, and when the computer program is executed by a processor, the steps of the data processing method described in the foregoing method embodiments are executed.
  • the storage medium may be a volatile or non-volatile computer-readable storage medium.
  • The embodiment of the present disclosure also provides a computer program product carrying program code, and the instructions included in the program code can be used to execute the steps of the data processing method described in the above method embodiment; for details, reference may be made to the above method embodiment, which will not be repeated here.
  • the above-mentioned computer program product may be specifically implemented by means of hardware, software or a combination thereof.
  • In one optional embodiment, the computer program product is embodied as a computer storage medium; in another optional embodiment, the computer program product is embodied as a software product, such as a software development kit (Software Development Kit, SDK), and so on.
  • the units described as separate components may or may not be physically separated, and the components shown as units may or may not be physical units, that is, they may be located in one place, or may be distributed to multiple network units. Part or all of the units can be selected according to actual needs to achieve the purpose of the solution of this embodiment.
  • each functional unit in each embodiment of the present disclosure may be integrated into one processing unit, each unit may exist separately physically, or two or more units may be integrated into one unit.
  • If the functions are realized in the form of software functional units and sold or used as independent products, they can be stored in a non-volatile computer-readable storage medium executable by a processor.
  • The technical solution of the present disclosure, in essence, or the part contributing to the prior art, or a part of the technical solution, can be embodied in the form of a software product. The computer software product is stored in a storage medium and includes several instructions for causing an electronic device (which may be a personal computer, a server, a network device, etc.) to execute all or part of the steps of the methods described in various embodiments of the present disclosure.
  • The aforementioned storage media include: a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, an optical disc, or other media that can store program code.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Studio Devices (AREA)

Abstract

The present disclosure provides a data processing method, apparatus, and system, a device, and a storage medium. The method comprises: in response to a received synchronous trigger signal from a positioning module, obtaining an image acquisition moment indicated by the synchronous trigger signal and an image exposure duration corresponding to a current image frame and acquired by an image sensor from the image acquisition moment (S101); determining timestamp information of the current image frame on the basis of the image acquisition moment and the image exposure duration corresponding to the current image frame (S102); and synchronously storing the current image frame and the timestamp information of the current image frame (S103), the timestamp information of the current image frame being used as reference time when the current image frame is fused with other pieces of acquired data. According to the present disclosure, the timestamp information is determined in combination with image exposure attributes, and timestamps determined by image sensors having different exposure attributes are also different, such that the determined timestamp information better fits the exposure time of a captured target, and better meets the requirements of an actual scene.

Description

A data processing method, device, system, equipment and storage medium
Cross Reference to Related Applications
This application claims priority to the Chinese patent application with application number CN2021114371064, filed with the China Patent Office on November 29, 2021, the entire contents of which are incorporated into this disclosure by reference.
Technical Field
The present disclosure relates to the field of computer technology, and in particular, to a data processing method, device, system, equipment, and storage medium.
Background
Autonomous driving is an important research direction in artificial intelligence technology. In order to improve the reliability of an automatic driving system, it is necessary to improve the system's ability to perceive objects in the environment (such as vehicles, pedestrians, etc.). The automatic driving system can detect environmental information through the fusion of data collected by multiple sensors, for target detection and tracking.
In order to realize the fusion of the data collected by each sensor, each frame of sensor data is required to carry time stamp information. In some embodiments, when the automatic driving system receives sensor data, the current system time can be added to the time tag of the corresponding data frame.
Summary
Embodiments of the present disclosure provide at least a data processing method, device, system, equipment, and storage medium.
In a first aspect, an embodiment of the present disclosure provides a data processing method, the method including: in response to receiving a synchronous trigger signal from a positioning module, acquiring the image acquisition moment indicated by the synchronous trigger signal, and the image exposure duration corresponding to the current image frame collected, from the image acquisition moment onward, by one of one or more image sensors; determining the timestamp information of the current image frame based on the image acquisition moment and the image exposure duration corresponding to the current image frame; and synchronously storing the current image frame and the timestamp information of the current image frame.
In a second aspect, an embodiment of the present disclosure further provides a data processing device, including: an acquisition module, configured to, in response to receiving a synchronous trigger signal from a positioning module, acquire the image acquisition moment indicated by the synchronous trigger signal and the image exposure duration corresponding to the current image frame collected by an image sensor from the image acquisition moment onward; a determination module, configured to determine the timestamp information of the current image frame based on the image acquisition moment and the image exposure duration corresponding to the current image frame; and a storage module, configured to synchronously store the current image frame and the timestamp information of the current image frame.
In a third aspect, an embodiment of the present disclosure further provides a data processing system, the system including: a field programmable gate array (FPGA) operation unit and an ARM processor. The ARM processor is configured to read into the memory the image acquisition moment indicated by the synchronous trigger signal generated by the global positioning system (GPS) positioning unit and the image exposure duration corresponding to the current image frame collected by the image sensor from the image acquisition moment onward, and to input the image acquisition moment and the image exposure duration corresponding to the current image frame into the FPGA operation unit. The FPGA operation unit is configured to determine the timestamp information of the current image frame based on the image acquisition moment and the image exposure duration corresponding to the current image frame, and to synchronously store the current image frame and the timestamp information of the current image frame into the memory.
In a fourth aspect, an embodiment of the present disclosure further provides an electronic device, including: a processor, a memory, and a bus, the memory storing machine-readable instructions executable by the processor. When the electronic device is running, the processor communicates with the memory through the bus, and the machine-readable instructions, when executed by the processor, perform the steps of the data processing method described in the first aspect or any of its implementations.
In a fifth aspect, an embodiment of the present disclosure further provides a computer-readable storage medium on which a computer program is stored, and the computer program, when run by a processor, performs the steps of the data processing method described in the first aspect or any of its implementations.
In order to make the above objects, features, and advantages of the present disclosure more comprehensible, preferred embodiments are described in detail below with reference to the accompanying drawings.
Brief Description of the Drawings
FIG. 1 shows a flowchart of a data processing method provided by an embodiment of the present disclosure;
FIG. 2 shows a flowchart of a specific method for determining timestamp information in a data processing method provided by an embodiment of the present disclosure;
FIG. 3a shows a flowchart of a specific method for synchronous acquisition in a data processing method provided by an embodiment of the present disclosure;
FIG. 3b shows a flowchart of a specific method for synchronous acquisition in a data processing method provided by an embodiment of the present disclosure;
FIG. 4 shows an application schematic diagram of a data processing method provided by an embodiment of the present disclosure;
FIG. 5 shows a schematic diagram of a data processing system provided by an embodiment of the present disclosure;
FIG. 6 shows a schematic diagram of a data processing device provided by an embodiment of the present disclosure;
FIG. 7 shows a schematic diagram of an electronic device provided by an embodiment of the present disclosure.
Detailed Description
In order to make the purpose, technical solutions, and advantages of the embodiments of the present disclosure clearer, the technical solutions in the embodiments of the present disclosure are described clearly and completely below in conjunction with the accompanying drawings. The described embodiments are only some, not all, of the embodiments of the present disclosure. The components of the embodiments of the present disclosure generally described and illustrated in the figures herein may be arranged and designed in a variety of different configurations. Accordingly, the following detailed description of the embodiments of the present disclosure provided in the accompanying drawings is not intended to limit the scope of the claimed disclosure, but merely represents selected embodiments of the present disclosure. Based on the embodiments of the present disclosure, all other embodiments obtained by those skilled in the art without creative effort fall within the protection scope of the present disclosure.
It should be noted that like numerals and letters denote similar items in the following figures; therefore, once an item is defined in one figure, it does not require further definition and explanation in subsequent figures.
The term "and/or" herein describes three possible relationships; for example, A and/or B may mean: A exists alone, A and B exist simultaneously, or B exists alone. In addition, the term "at least one" herein means any one of multiple items, or any combination of at least two of them; for example, including at least one of A, B, and C may mean including any one or more elements selected from the set consisting of A, B, and C.
Research has found that, in an automatic driving system, in order to realize the fusion of the data collected by each sensor, each frame of sensor data is required to carry timestamp information. In some embodiments, when the automatic driving system receives sensor data, the current system time can be added to the time tag of the corresponding data frame.
However, due to a certain time delay in data transmission, there is a large error in the timestamp information determined this way.
Based on the above research, the present disclosure provides a data processing method, device, system, equipment, and storage medium, so as to determine a more accurate data timestamp.
To facilitate understanding of this embodiment, a data processing method disclosed in the embodiments of the present disclosure is first introduced in detail. The execution subject of the data processing method provided in the embodiments of the present disclosure is generally an electronic device with a certain computing capability, for example: a terminal device, a server, or other processing equipment. The terminal device may be a user equipment (UE), a mobile device, a personal digital assistant (PDA), a handheld device, a computing device, a vehicle-mounted device, a wearable device, and so on; the other processing equipment may be a customized development board based on a semi-custom circuit such as a field programmable gate array (FPGA). Considering the excellent characteristics of the FPGA board, the FPGA board is mostly used as the execution subject in the following examples.
In some possible implementations, the above data processing method may be implemented by a processor invoking computer-readable instructions stored in a memory.
Referring to FIG. 1, which is a flowchart of the data processing method provided by an embodiment of the present disclosure, the data processing method includes the following steps S101 to S103.
S101: In response to receiving the synchronous trigger signal of the positioning module, acquire the image acquisition time indicated by the synchronous trigger signal, and the image exposure duration corresponding to the current image frame collected by the image sensor from the image acquisition time onward;
S102: Determine the timestamp information of the current image frame based on the image acquisition time and the image exposure duration corresponding to the current image frame;
S103: Synchronously store the current image frame and the timestamp information of the current image frame; the timestamp information of the current image frame is used as a reference time when fusing the current image frame with other collected data.
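Steps S101–S103 can be sketched as a minimal pipeline. Everything here is illustrative: the exposure-center rule used for S102 is one of the variants the disclosure describes, and the structure and names are invented.

```python
from dataclasses import dataclass

@dataclass
class StampedFrame:
    frame: bytes      # raw image data
    timestamp: float  # reference time used later for fusion (S103)

def process_frame(acq_time, exposure, frame):
    # S101: acq_time and exposure are captured when the sync trigger arrives.
    # S102: here the timestamp is taken as the center of the exposure window.
    timestamp = acq_time + exposure / 2.0
    # S103: the frame and its timestamp are kept together ("synchronously stored").
    return StampedFrame(frame=frame, timestamp=timestamp)
```

The key point of the design is that the timestamp is computed from the trigger-indicated acquisition moment and the exposure duration, not from the (delayed) arrival time of the data at the host.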
In order to facilitate understanding of the data processing method provided by the embodiments of the present disclosure, the application scenario of the method is first described in detail. The data processing method in the embodiments of the present disclosure can be applied to determining the timestamp of an image collected by an image sensor. The image sensor is an indispensable sensor in the field of autonomous driving, and the accuracy of the timestamp information of the images it collects provides strong data support for subsequent data fusion.
In the related art, timestamp information can be determined by software: when the system receives sensor data, the current system time is added to the time tag of the corresponding data frame as its timestamp. Because data transmission involves a certain time delay, this approach introduces a large error.
In addition, the related art also provides a hardware method for determining timestamp information, in which a hardware trigger signal controls all sensors to collect data at the same moment. This method controls synchronous acquisition between sensors well. However, for image sensors such as cameras, most cameras integrate an automatic exposure (AE) function in order to adapt to different lighting environments, so the exposure duration differs across lighting environments; even in the same lighting environment, cameras at different positions have different exposure durations. Because of these differences in exposure duration, the timestamp determined by the above hardware trigger method is inaccurate.
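The inaccuracy can be made concrete with a small numeric sketch (the values are hypothetical): two cameras triggered at the same instant but with different AE exposure durations have exposure centers that differ by half the exposure difference, so tagging both frames with the bare trigger time misrepresents at least one of them.

```python
trigger = 0.0                        # both cameras triggered at the same instant
exp_bright, exp_dim = 0.004, 0.033   # hypothetical AE exposure durations (seconds)
center_bright = trigger + exp_bright / 2
center_dim = trigger + exp_dim / 2
skew = center_dim - center_bright    # error if both frames get the bare trigger time
```

Here the two frames' effective capture moments differ by 14.5 ms even though the trigger was simultaneous, which is significant at typical vehicle speeds.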
In order to solve the above problems, the embodiments of the present disclosure provide a data processing method that determines timestamp information based on the image exposure duration, so that the determined timestamp information is more accurate.
In the field of autonomous driving, the positioning module in the embodiments of the present disclosure may be a positioning receiver installed on the vehicle, for example a Global Positioning System (GPS) receiver or a Global Navigation Satellite System (GNSS) receiver, or another positioning receiver; no specific limitation is made here.
In addition, the image sensor in the embodiments of the present disclosure may be a camera installed on the vehicle to capture the vehicle's surroundings. The installation positions of the cameras can be configured according to different scene requirements; for example, two cameras may be installed at each of the front, the rear, and the sides of the vehicle; as another example, cameras may be installed only at the front of the vehicle. The specific configuration is not limited here.
The positioning module in the embodiments of the present disclosure may receive, through an antenna, high-precision time information based on atomic clocks sent by satellites, and output data packets in the National Marine Electronics Association (NMEA) format together with a pulse-per-second (PPS) whole-second clock signal to the FPGA board.
In response to receiving a synchronization trigger signal from the positioning module, the data processing method provided by the embodiments of the present disclosure may acquire the image acquisition moment indicated by the synchronization trigger signal and the image exposure duration corresponding to the current image frame that the image sensor starts to acquire at the image acquisition moment.
Here, the image acquisition moment may indicate the trigger moment corresponding to the rising edge of the image acquisition trigger signal for the image sensor, generated under the synchronization trigger signal; at this trigger moment, the image sensor starts the image acquisition operation. The same trigger moment may be preset for different image sensors to realize synchronous triggering of multiple image sensors, or different trigger moments may be preset to match the acquisition requirements of other sensors.
In practical applications, the internal structure of an image sensor has multiple rows of photosensitive elements and uses, for example, a global shutter or a rolling shutter. Considering that the image exposure durations corresponding to different rows of image data (pixels) may be the same, the image exposure duration here may be the exposure duration of a row of image data.
In the embodiments of the present disclosure, the timestamp information of the current image frame can be determined based on the image acquisition moment and the image exposure duration corresponding to the current image frame; that is, when the image exposure durations differ, the determined timestamp information also differs, thereby eliminating the impact of differences in exposure duration on timestamp accuracy.
When the current image frame and its timestamp information are stored synchronously, the current image frame carrying the timestamp information can be fused with other collected data. The other collected data here may be any data to be fused with the current image frame. The above data fusion scheme may be implemented based on an industrial personal computer (IPC).
In the embodiments of the present disclosure, in response to receiving a data fusion instruction, the stored current image frame and the timestamp information of the current image frame are acquired; based on the timestamp information of the current image frame, the current image frame is fused with the collected data to obtain fused data.
With the above data processing method, in response to the synchronization trigger signal of the positioning module, the image acquisition moment indicated by the synchronization trigger signal and the image exposure duration corresponding to the acquisition of the current image frame can be obtained, and the timestamp information of the current image frame can then be determined based on the image acquisition moment and the corresponding image exposure duration. The present disclosure determines timestamp information in combination with the image exposure attributes, so image sensors with different exposure attributes yield different timestamps. To a certain extent, this makes the determined timestamp information better match the exposure time of the captured target and better meet the requirements of actual scenarios.
As far as autonomous driving scenarios are concerned, since the determined timestamp information is sufficiently accurate, the fused data can also characterize the surroundings of the autonomous vehicle more accurately, thereby guiding safer autonomous driving operations.
In an example, the timestamp determination process may include the following steps:
Step 1: based on the image acquisition moment and a preset time difference interval, determine the row exposure start/end moment of the first row of image data in the current image frame;
Step 2: based on the row exposure start/end moment of the first row of image data in the current image frame and the image exposure duration corresponding to the first row of image data, determine the timestamp information of the current image frame.
The timestamp information here may indicate the exposure center moment of the current image frame. This is mainly because the center pixel corresponding to the exposure center moment is more likely to correspond to the target of interest in the image; thus, when the exposure center moment is used as the timestamp information of an image frame, the accuracy of the timestamp can be significantly improved.
To determine the exposure center moment of an image, the internal synchronization mechanism of the image sensor needs to be considered. The internal synchronization mechanism of the image sensor may be left-aligned, i.e., the time between the image acquisition trigger signal and the image exposure start moment is fixed (there is a preset time difference interval between the image acquisition trigger signal and the image exposure start moment); it may also be right-aligned, i.e., the time between the image acquisition trigger signal and the image exposure end moment is fixed (there is a preset time difference interval between the image acquisition trigger signal and the image exposure end moment). The exposure start/end moment includes either the image exposure start moment or the image exposure end moment. For ease of explanation, the right-aligned case is used as an example below.
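The two alignment modes can be sketched with a small helper (an illustrative sketch only; the names `row_exposure_bounds`, `trigger_time`, `delta`, and `t_exposure` are chosen here for exposition and are not from the disclosure):

```python
def row_exposure_bounds(trigger_time, delta, t_exposure, alignment="right"):
    """Return (start, end) of the first row's exposure window.

    alignment="left":  the exposure START is a fixed delay after the trigger.
    alignment="right": the exposure END is a fixed delay after the trigger.
    """
    if alignment == "left":
        start = trigger_time + delta
        return start, start + t_exposure
    if alignment == "right":
        end = trigger_time + delta
        return end - t_exposure, end
    raise ValueError("alignment must be 'left' or 'right'")

# Right-aligned example: trigger at t0 = 0.0 s, fixed delay 5 ms, exposure 10 ms.
# The exposure window ends 5 ms after the trigger regardless of exposure length.
print(row_exposure_bounds(0.0, 0.005, 0.010))
```

Note that in the right-aligned case a longer exposure moves the window start earlier while the end stays fixed, which is exactly why the trigger moment alone cannot serve as an accurate timestamp.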
Considering that different image sensors also have different exposure modes — some expose all rows of image data simultaneously (e.g., a global shutter), while in others the exposure start moments of adjacent rows of image data differ by one time unit (e.g., a rolling shutter) — once the row exposure start/end moment of the first row of image data is determined, the row exposure center moment of the middle row of image data can be determined in combination with the exposure mode of the corresponding image sensor, and this moment can serve as the timestamp information of the image frame. Next, the schemes for determining the timestamp information of the current image frame are described for each of the above two exposure modes.
For the case where all rows of image data are exposed simultaneously, the timestamp information of the current image frame may be determined as follows:
Step 1: based on the row exposure start/end moment of the first row of image data in the current image frame and the image exposure duration corresponding to the first row of image data, obtain the row exposure center moment of the first row of image data;
Step 2: determine the row exposure center moment of the first row of image data as the timestamp information of the current image frame.
Here, since the first row of image data corresponds to the entire image exposure duration, the exposure duration from the start of exposure to the center pixel of the first row of image data, or from the center pixel of the first row of image data to the end of exposure, is half of the entire image exposure duration. Taking the difference between the row exposure start/end moment of the first row of image data and half of the above image exposure duration then yields the row exposure center moment of the first row of image data.
Since all rows of image data of the current image frame are exposed simultaneously, the row exposure center moment of the middle row of image data is the same as that of the first row, so the row exposure center moment of the first row of image data can be determined as the timestamp information of the current image frame.
For the case where the row exposure start moments of adjacent rows of image data differ by one time unit, the timestamp information of the current image frame may be determined as follows:
Step 1: based on the row exposure start/end moment of the first row of image data in the current image frame and the image exposure duration corresponding to the first row of image data, obtain the row exposure center moment of the first row of image data;
Step 2: based on the row exposure center moment of the first row of image data and the number of time units between the middle row of image data and the first row of image data in the current image frame, determine the row exposure center moment of the middle row of image data;
Step 3: determine the row exposure center moment of the middle row of image data as the timestamp information of the current image frame.
In an example, the row exposure center moment of the first row of image data may be determined first. Since the exposure start moments of adjacent rows of image data in the current image frame differ by one time unit, the row exposure center moments of the middle row and the first row differ by as many time units as there are rows between them. The row exposure center moment of the middle row of image data can therefore be determined from that of the first row, and can be determined as the timestamp information of the current image frame.
To facilitate understanding of the above process of determining the timestamp information, a further explanation is given below in conjunction with Fig. 2 and the related formulas.
In an example, as shown in Fig. 2, the frame synchronizer (FSYNC) signal may be provided by the synchronization trigger signal generation module in the FPGA board, with a frequency matching the actual camera frame rate, which may be set to 30 Hz here. After the image sensor receives the FSYNC synchronization signal, the phase-locked loop inside the image sensor starts to work and enters the locked state after a period of time. In the locked state, the vertical synchronizer (VSYNC) inside the image sensor has the same frequency and phase as the externally input FSYNC.
For a right-aligned image sensor, the time interval between the rising edge of the image acquisition trigger signal and the readout moment of each row of image data (which is also the row exposure end moment) is fixed. Here, t0 is the image acquisition moment indicated by the FSYNC synchronization trigger signal of the N-th image frame, t1 is the row exposure end moment of the first row of image data, whose time difference from t0 is a constant T(readout), t2 is the row exposure center moment of the first row of image data, and the row exposure duration of the current image frame is T(exposure).
As can be seen from Fig. 2, the row exposure center moment of the first row of image data of the current image frame is t2 = t0 + T(readout) - 0.5*T(exposure).
For a global-shutter image sensor, all rows are exposed simultaneously, and the corresponding timestamp information t of the current image frame is: t = t2 = t0 + T(readout) - 0.5*T(exposure).
For a rolling-shutter image sensor, the exposure times of adjacent rows differ by one time unit 1H. Assuming the resolution of the image is Wide x High, where Wide denotes the number of pixels in the horizontal direction and High denotes the number of pixels in the vertical direction, the corresponding timestamp information t of the current image frame is:
t = t2 + H*(High-1) = t0 + T(readout) - 0.5*T(exposure) + H*(High-1).
Here, the values of T(readout) and H can be determined according to the specific sensor, and the value of T(exposure) can be obtained from the Mobile Industry Processor Interface (MIPI) data stream.
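Collecting the expressions above into one routine gives the following sketch (right-aligned assumption; the parameter names mirror the symbols t0, T(readout), T(exposure), H and High, and the rolling-shutter branch reproduces the expression t = t2 + H*(High-1) exactly as stated above):

```python
def frame_timestamp(t0, t_readout, t_exposure, shutter="global",
                    unit_h=0.0, high=1):
    """Timestamp of an image frame for a right-aligned image sensor.

    t0:         image acquisition moment (rising edge of the trigger)
    t_readout:  fixed delay between trigger edge and row readout, T(readout)
    t_exposure: row exposure duration, T(exposure)
    unit_h:     one row-to-row exposure offset (1H), rolling shutter only
    high:       number of pixel rows (High), rolling shutter only
    """
    # Row exposure center of the first row: t2 = t0 + T(readout) - 0.5*T(exposure)
    t2 = t0 + t_readout - 0.5 * t_exposure
    if shutter == "global":
        # All rows expose simultaneously, so the frame timestamp is t2 itself.
        return t2
    # Rolling shutter, following the expression given in the text:
    # t = t2 + H*(High - 1)
    return t2 + unit_h * (high - 1)

# Example: t0 = 1.000 s, T(readout) = 2 ms, T(exposure) = 8 ms
print(frame_timestamp(1.0, 0.002, 0.008))                          # global shutter
print(frame_timestamp(1.0, 0.002, 0.008, "rolling", 1e-5, 1080))   # rolling shutter
```

The example values for T(readout), T(exposure), 1H, and High are placeholders; in practice they come from the sensor datasheet and the MIPI data stream as noted above.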
In the embodiments of the present disclosure, when the image acquisition moment t0 and the image exposure duration T(exposure) corresponding to the current image frame are obtained from the image sensor, the timestamp information can be determined for the different exposure modes according to the above specific implementations, thereby assigning accurate timestamp information to each image frame.
In some embodiments, determining the timestamp information of the current image frame based on the row exposure start/end moment of the first row of image data in the current image frame and the row exposure duration corresponding to the first row of image data includes: determining the exposure mode of the image sensor; in response to determining that the exposure mode of the image sensor is simultaneous exposure of all rows of image data, obtaining the row exposure center moment of the first row of image data based on the row exposure start/end moment of the first row of image data in the current image frame and the image exposure duration corresponding to the first row of image data, and determining the row exposure center moment of the first row of image data as the timestamp information of the current image frame; in response to determining that the exposure mode of the image sensor is that the row exposure start moments of adjacent rows of image data differ by one time unit, obtaining the row exposure center moment of the first row of image data based on the row exposure start/end moment of the first row of image data in the current image frame and the corresponding row exposure duration, determining the row exposure center moment of the middle row of image data based on the row exposure center moment of the first row of image data and the number of time units between the middle row of image data and the first row of image data in the current image frame, and determining the row exposure center moment of the middle row of image data as the timestamp information of the current image frame.
In practical applications, to better adapt to application scenarios, multiple image sensors may be combined to realize the fusion of image data. The multiple image sensors here may acquire image frames synchronously; specifically, the image acquisition moment indicated by the synchronization trigger signal may be determined as follows:
Step 1: in response to receiving the synchronization trigger signal of the positioning module, for each of the multiple image sensors, generate an image acquisition trigger signal for triggering that image sensor to acquire the current image frame;
Step 2: determine the trigger moment corresponding to the rising edge of the image acquisition trigger signal as the image acquisition moment of that image sensor.
Here, the image acquisition trigger signal for triggering image frame acquisition by an image sensor may be generated based on the synchronization trigger signal. For each image sensor, one corresponding image acquisition trigger signal is generated, and all the image acquisition trigger signals have the same frequency and phase; that is, the different image sensors acquire images synchronously and multiple image sensors can be exposed at the same time, which suits application scenarios dominated by visual perception.
The multiple image sensors in the embodiments of the present disclosure may also acquire image frames at preset time intervals from one another; specifically, the image acquisition moment indicated by the synchronization trigger signal may be determined as follows:
Step 1: in response to receiving the synchronization trigger signal of the positioning module, based on the preset time intervals between the multiple image sensors, perform frequency conversion and/or phase shifting on the synchronization trigger signal to obtain the image acquisition trigger signals for triggering the multiple image sensors to respectively acquire their current image frames, where the time intervals between the rising edges of the image acquisition trigger signals of the multiple image sensors satisfy the preset time intervals;
Step 2: for each of the multiple image sensors, determine the trigger moment corresponding to the rising edge of the image acquisition trigger signal of that image sensor as the image acquisition moment of that image sensor.
Here, the image acquisition trigger signals may be generated by frequency conversion or phase shifting of the synchronization trigger signal; that is, different image sensors may acquire images asynchronously, which suits application scenarios dominated by lidar perception. In this case, the initial phase of each image sensor's trigger signal needs to be adjusted according to its specific installation position to ensure that the image frames are aligned with the point cloud data.
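The two triggering schemes can be illustrated with a small sketch (illustrative only; `trigger_edges` and its parameters are names chosen here and are not part of the disclosure):

```python
def trigger_edges(pps_epoch, frame_rate_hz, phase_offset_s, n_frames):
    """Rising-edge times of one sensor's trigger signal, derived from a PPS edge.

    pps_epoch:      time of a PPS whole-second edge
    frame_rate_hz:  trigger frequency after frequency multiplication
    phase_offset_s: initial phase shift for this sensor
                    (0 for synchronously triggered sensors)
    """
    period = 1.0 / frame_rate_hz
    return [pps_epoch + phase_offset_s + k * period for k in range(n_frames)]

# Two synchronized cameras: same frequency, same phase -> identical edges.
cam0 = trigger_edges(0.0, 30.0, 0.0, 3)
cam1 = trigger_edges(0.0, 30.0, 0.0, 3)
# A third camera phase-shifted by 4 ms to match, e.g., a lidar sub-range.
cam2 = trigger_edges(0.0, 30.0, 0.004, 3)
print(cam0, cam2)
```

The 30 Hz rate and 4 ms offset are example values only; in the disclosure the actual frequencies and phases are produced in hardware by the FPGA trigger module.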
To facilitate further understanding of the above synchronization configurations of the image sensors, a specific description is given below using eight image sensors (Camera0-Camera7), one lidar, and one inertial measurement unit (IMU) as an example.
As shown in Figs. 3a and 3b, after the synchronization signal generation module (Trigger Module) in the FPGA board receives the PPS second clock signal of the GNSS module, it generates, through phase locking and frequency multiplication, trigger signals at the frequencies required by the cameras and the IMU, with a phase error of ±10 ns. Among them, FSYNC0-FSYNC7 are used to control the synchronous acquisition of the eight cameras, and FSYNC8 is used to trigger the synchronous acquisition of the IMU; the frequency/phase is variable.
As shown in Fig. 3a, the eight image sensors have the same frequency and phase, which may be obtained by frequency conversion of the GNSS PPS signal; this enables the eight image sensors to be exposed simultaneously. Similarly, the signals for the lidar and the IMU may also be obtained by frequency conversion of the PPS signal. This is only a specific example; the adjustment of the specific frequency conversion coefficients may be determined in combination with different application requirements, and details are not described here.
As shown in Fig. 3b, Camera0-Camera2 of the eight image sensors work at the same frequency and phase, while Camera3-Camera7 may have different frequencies and/or different phases, so that there is a certain exposure time difference among the image sensors; the exposure time difference directly affects the image acquisition order of the image sensors. For the acquisition operations of the lidar and the IMU, refer to the above description, which is not repeated here.
In practical applications, if a rotating lidar is used, the above eight cameras may acquire images at preset time intervals from one another. Specifically, the preset time interval may be determined through the following steps:
Step 1: obtain the scanning range of the radar device, where the radar device and the image sensors capture the same scene;
Step 2: divide the scanning range into multiple scanning sub-ranges, where each of the multiple image sensors is responsible for capturing images within one of the multiple scanning sub-ranges;
Step 3: based on the scanning time of the radar device for each of the multiple scanning sub-ranges, determine the preset time interval between two adjacent image sensors among the multiple image sensors.
The radar device and the image sensors here may capture the same scene. In practical applications, the radar device may be installed on the autonomous vehicle together with the image sensors. In this way, based on the correspondence between the scanning sub-ranges of the radar device and the image sensors, the preset time interval between two adjacent image sensors can be determined, so that the installation position of each image sensor can be determined, which facilitates scene deployment.
Determining the preset time intervals between the image sensors in combination with the scanning range of the radar device means that, as the radar device sweeps through its scanning sub-ranges, the acquisition operations of the multiple image sensors are triggered in turn, thereby realizing synchronous acquisition between the radar device and the multiple image sensors with better adaptability.
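Step 3 above can be sketched as follows (assuming, for illustration only, that the 360° scan is split into equal sub-ranges; the function and parameter names are chosen here and are not from the disclosure):

```python
def camera_offsets(scan_period_s, n_cameras):
    """Firing offsets so each camera's exposure coincides with the moment
    the rotating lidar sweeps that camera's sub-range.

    The full revolution is split into n_cameras equal sub-ranges; the preset
    time interval between adjacent cameras is the scan time of one sub-range.
    """
    interval = scan_period_s / n_cameras
    return interval, [i * interval for i in range(n_cameras)]

# A 10 Hz rotating lidar (100 ms per revolution) paired with 8 cameras:
interval, offsets = camera_offsets(0.1, 8)
print(interval)   # scan time of one sub-range = preset inter-camera interval
print(offsets)    # trigger offset of each camera within one revolution
```

With unequal sub-ranges, the offsets would instead be cumulative sums of the per-sub-range scan times, but the principle is the same.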
In the embodiments of the present disclosure, in response to receiving the synchronization trigger signal of the positioning module, the pose capture moment of the target pose data captured by the inertial measurement device may be acquired; the pose capture moment of the target pose data is determined as the timestamp information of the target pose data, and the target pose data and its timestamp information are stored synchronously.
In this way, when an image frame needs to be fused with the target pose data, the data fusion can be realized based on the proximity between the timestamp information of the target pose data and the timestamp information of the image frame, and the fused data better matches the current application scenario.
In the embodiments of the present disclosure, in response to receiving the synchronization trigger signal of the positioning module, the point cloud scanning moment of the point cloud data scanned by the radar device may also be acquired; when the point cloud scanning moment is determined as the timestamp information of the point cloud data, the point cloud data and its timestamp information are stored synchronously.
In this way, when an image frame needs to be fused with the point cloud data, the data fusion can be realized based on the proximity between the timestamp information of the point cloud data and the timestamp information of the image frame, and the fused data better matches the current application scenario.
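The proximity-based matching described above can be sketched as follows (illustrative only; `nearest_by_timestamp` and the sample records are invented for this example):

```python
import bisect

def nearest_by_timestamp(frame_ts, records):
    """Pick the record whose timestamp is closest to the image frame's.

    records: list of (timestamp, payload) tuples sorted by timestamp,
             e.g. IMU pose samples or lidar point-cloud sweeps.
    """
    keys = [ts for ts, _ in records]
    i = bisect.bisect_left(keys, frame_ts)
    # Only the neighbors around the insertion point can be the closest.
    candidates = records[max(0, i - 1):i + 1]
    return min(candidates, key=lambda r: abs(r[0] - frame_ts))

poses = [(0.00, "pose@0ms"), (0.02, "pose@20ms"), (0.04, "pose@40ms")]
print(nearest_by_timestamp(0.0251, poses))  # → (0.02, 'pose@20ms')
```

The accuracy of this matching is exactly why the exposure-center timestamp matters: an error of half an exposure duration can flip which pose or sweep is selected.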
In practical applications, the fusion of all three types of data above may also be realized, which is not specifically limited here.
For different types of radar devices, the point cloud scanning moment may be determined based on different timing schemes.
First, synchronization may be based on GPS satellite time; second, time synchronization may be realized based on the Precision Time Protocol (PTP).
Specifically, the trigger moment corresponding to the synchronization trigger signal may be determined as the point cloud scanning moment. Alternatively, when a first timing unit included in the radar device is synchronized in real time with a second timing unit included in the Ethernet controller, the radar device may obtain the timing time of the second timing unit and use it as the point cloud scanning moment; the point cloud scanning moment here may be the scanning moment corresponding to a certain determined phase during the point cloud scanning process. The timing time of the second timing unit is the start timing time recorded by the second timing unit when the trigger moment corresponding to the synchronization trigger signal is synchronized to the second timing unit, and this timing may be realized based on synchronization of the system time. A specific example is described in conjunction with Fig. 4.
As shown in Fig. 4, after the FPGA receives the NMEA-format data packets and the PPS signal output by the GNSS positioning module, they are split into two physical links: one link is transmitted directly to Lidar0 and used as the GPS synchronization input of Lidar0 (corresponding to the first synchronization scheme above); the other link goes to the ARM module in the SoC (System on Chip) chip, where the GPS time information is received through the Chrony tool and synchronized to the system time (System Time). Then, the phc2sys tool is used to synchronize the system time to the second timing unit in the Ethernet controller (GEM) to set up a PTP master, and a network switch (Network Switcher) is used for the timing of Lidar1 and Lidar2 (corresponding to the second synchronization scheme above).
In addition, the embodiments of the present disclosure may also set up an NTP server (Network Time Protocol server) through the Chrony tool to provide timing for the IPC, and the IPC may realize fusion between data based on different fusion requirements.
As shown in Fig. 4, the trigger module may use the rising edge of the TrigX signal, which has the same frequency and phase as FSYNC, to trigger the second timing unit in the GEM, and read the timing time in the second timing unit through the internal high-speed AXI (Advanced eXtensible Interface) bus.
如图4可知的是,上述有关雷达设备的时间同步可以是基于ARM实现的,在实际应用中,也可以是集成在FPGA板上实现的,在此不做具体的限制。As can be seen from FIG. 4, the time synchronization of the radar equipment mentioned above can be implemented based on ARM, and in practical applications, it can also be implemented on an FPGA board, and no specific limitation is set here.
Here, based on the above data processing method executed by the FPGA computing unit, an embodiment of the present disclosure further provides a data processing system. As shown in FIG. 5, the system may include an FPGA computing unit 501 and an ARM processor 502.
The ARM processor 502 is configured to read, into a memory, the image acquisition time indicated by the synchronization trigger signal generated by the GPS (Global Positioning System) positioning unit and the image exposure duration corresponding to the current image frame acquired by the image sensor starting from the image acquisition time, and to input the image acquisition time and the image exposure duration corresponding to the current image frame to the FPGA computing unit 501.
The FPGA computing unit 501 is configured to determine the timestamp information of the current image frame based on the image acquisition time and the image exposure duration corresponding to the current image frame, and to synchronously store the current image frame and its timestamp information in the memory; the timestamp information of the current image frame serves as a reference time when the current image frame is fused with other acquired data.
For the operation logic of the FPGA computing unit, reference may be made to the above description, and details are not repeated here. The read operation of the ARM processor is likewise not repeated here.
Those skilled in the art can understand that, in the above methods of the specific implementations, the order in which the steps are written does not imply a strict execution order or constitute any limitation on the implementation process; the specific execution order of the steps should be determined by their functions and possible internal logic.
Based on the same inventive concept, an embodiment of the present disclosure further provides a data processing apparatus corresponding to the data processing method. Since the problem-solving principle of the apparatus in the embodiments of the present disclosure is similar to that of the above data processing method, the implementation of the apparatus may refer to the implementation of the method, and repeated descriptions are omitted.
Referring to FIG. 6, which is a schematic diagram of a data processing apparatus provided by an embodiment of the present disclosure, the apparatus includes an acquisition module 601, a determination module 602, and a storage module 603, wherein:
the acquisition module 601 is configured to, in response to receiving a synchronization trigger signal from the positioning module, acquire the image acquisition time indicated by the synchronization trigger signal and the image exposure duration corresponding to the current image frame acquired by one of one or more image sensors starting from the image acquisition time;
the determination module 602 is configured to determine the timestamp information of the current image frame based on the image acquisition time and the image exposure duration corresponding to the current image frame;
the storage module 603 is configured to synchronously store the current image frame and the timestamp information of the current image frame; the timestamp information of the current image frame serves as a reference time when the current image frame is fused with other acquired data.
With the above data processing apparatus, in response to the synchronization trigger signal from the positioning module, the image acquisition time indicated by the synchronization trigger signal and the image exposure duration corresponding to the acquisition of the current image frame can be obtained, and the timestamp information of the current image frame can then be determined based on the image acquisition time and the corresponding image exposure duration. The present disclosure determines the timestamp information in combination with image exposure attributes, so that image sensors with different exposure attributes determine different timestamps; to some extent, this makes the determined timestamp information fit the exposure time of the captured target more closely and better meet the requirements of actual scenarios.
In a possible implementation, the determination module 602 is configured to determine the timestamp information of the current image frame based on the image acquisition time and the image exposure duration corresponding to the current image frame according to the following steps:
determining, based on the image acquisition time and a preset time difference interval, a row exposure start/end time of the first row of image data in the current image frame, wherein the row exposure start/end time refers to a row exposure start time or a row exposure end time;
determining the timestamp information of the current image frame based on the row exposure start/end time of the first row of image data in the current image frame and the image exposure duration corresponding to the first row of image data.
In a possible implementation, in the case where all rows of image data in the current image frame are exposed simultaneously, the determination module 602 is configured to determine the timestamp information of the current image frame based on the row exposure start/end time of the first row of image data in the current image frame and the image exposure duration corresponding to the first row of image data according to the following steps:
obtaining a row exposure center time of the first row of image data based on the row exposure start/end time of the first row of image data in the current image frame and the image exposure duration corresponding to the first row of image data;
determining the row exposure center time of the first row of image data as the timestamp information of the current image frame.
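As an illustrative sketch of the simultaneous-exposure (global shutter) case above — the function and parameter names, and the microsecond units, are hypothetical and not taken from the disclosure — the frame timestamp is simply the first row's exposure start time plus half the exposure duration:

```python
def frame_timestamp_global_shutter(acq_time_us: float,
                                   preset_offset_us: float,
                                   exposure_us: float) -> float:
    """All rows expose simultaneously: the frame timestamp is the
    exposure center of the first row (row start + half the exposure)."""
    # Row exposure start time derived from the image acquisition time
    # and the preset time difference interval.
    row_start_us = acq_time_us + preset_offset_us
    # Row exposure center time, taken as the frame timestamp.
    return row_start_us + exposure_us / 2.0
```

For example, an acquisition time of 1000 µs, a preset offset of 10 µs, and a 20 µs exposure would yield a timestamp of 1020 µs.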
In a possible implementation, in the case where the row exposure start times of adjacent rows of image data in the current image frame differ by one time unit, the determination module 602 is configured to determine the timestamp information of the current image frame based on the row exposure start/end time of the first row of image data in the current image frame and the image exposure duration corresponding to the first row of image data according to the following steps:
obtaining a row exposure center time of the first row of image data based on the row exposure start/end time of the first row of image data in the current image frame and the image exposure duration corresponding to the first row of image data;
determining a row exposure center time of the middle row of image data based on the row exposure center time of the first row of image data and the number of time units between the middle row of image data and the first row of image data in the current image frame;
determining the row exposure center time of the middle row of image data as the timestamp information of the current image frame.
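The rolling-shutter variant above, where adjacent rows start exposing one time unit apart and the middle row's exposure center serves as the frame timestamp, might be sketched as follows (names and units are again hypothetical, for illustration only):

```python
def frame_timestamp_rolling_shutter(acq_time_us: float,
                                    preset_offset_us: float,
                                    exposure_us: float,
                                    num_rows: int,
                                    row_unit_us: float) -> float:
    """Adjacent rows start one time unit apart: the frame timestamp is
    the exposure center of the middle row, offset from the first row's
    center by (middle row index) time units."""
    # Exposure center of the first row of image data.
    first_center_us = acq_time_us + preset_offset_us + exposure_us / 2.0
    # Number of time units between the middle row and the first row.
    middle_row = num_rows // 2
    return first_center_us + middle_row * row_unit_us
```

With 100 rows one microsecond apart, the timestamp lands 50 µs after the first row's exposure center, i.e. near the temporal middle of the frame readout.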
In a possible implementation, the one or more image sensors include a plurality of image sensors, and the plurality of image sensors perform image acquisition synchronously; the acquisition module 601 is configured to, in response to receiving the synchronization trigger signal from the positioning module, acquire the image acquisition time indicated by the synchronization trigger signal according to the following steps:
in response to receiving the synchronization trigger signal from the positioning module, generating, for each of the plurality of image sensors, an image acquisition trigger signal for triggering that image sensor to acquire the current image frame;
determining the trigger time corresponding to the rising edge of the image acquisition trigger signal as the image acquisition time of that image sensor.
In a possible implementation, the one or more image sensors include a plurality of image sensors, and the plurality of image sensors perform image acquisition staggered by a preset time interval; the acquisition module 601 is configured to, in response to receiving the synchronization trigger signal from the positioning module, acquire the image acquisition time indicated by the synchronization trigger signal according to the following steps:
in response to receiving the synchronization trigger signal from the positioning module, performing frequency conversion processing and/or phase conversion processing on the synchronization trigger signal based on the preset time interval between the plurality of image sensors, to obtain image acquisition trigger signals for respectively triggering the plurality of image sensors to acquire their current image frames, wherein the time intervals between the rising edges of the image acquisition trigger signals of the plurality of image sensors satisfy the preset time interval;
determining, for each of the plurality of image sensors, the trigger time corresponding to the rising edge of the image acquisition trigger signal of that image sensor as the image acquisition time of that image sensor.
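The staggered triggering above can be sketched at the timestamp level, assuming for illustration that the frequency/phase conversion amounts to offsetting each sensor's rising edge by an integer multiple of the preset interval (the actual signal processing is hardware-level and not specified here; names are hypothetical):

```python
def staggered_trigger_times(sync_time_us: float,
                            num_sensors: int,
                            preset_interval_us: float) -> list:
    """Derive per-sensor rising-edge trigger times from one synchronization
    trigger signal so that adjacent sensors fire preset_interval_us apart;
    each rising-edge time doubles as that sensor's image acquisition time."""
    return [sync_time_us + i * preset_interval_us for i in range(num_sensors)]
```

For three sensors and a 25 µs interval, the sensors would be triggered at 0, 25, and 50 µs after the synchronization signal.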
In a possible implementation, the acquisition module 601 is configured to determine the preset time interval between the plurality of image sensors according to the following steps:
acquiring the scanning range of a radar device, wherein the radar device and the image sensors acquire data for the same scene;
dividing the scanning range into a plurality of scanning sub-ranges, wherein each of the plurality of image sensors is responsible for acquiring images within one of the plurality of scanning sub-ranges;
determining the preset time interval between two adjacent image sensors among the plurality of image sensors based on the scanning time of the radar device for each of the plurality of scanning sub-ranges.
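Assuming, purely for illustration, that the scanning range is divided evenly among the sensors and the radar sweeps at a constant angular rate (the disclosure does not fix either assumption), the time the radar spends sweeping one sub-range could serve as the preset interval between adjacent sensors:

```python
def preset_interval_from_radar(scan_range_deg: float,
                               scan_period_us: float,
                               num_sensors: int) -> float:
    """Split the radar scanning range evenly into one sub-range per image
    sensor; the sweep time of one sub-range becomes the preset interval
    between adjacent sensors' triggers."""
    sub_range_deg = scan_range_deg / num_sensors
    # At a constant sweep rate, sweep time is proportional to angle.
    return scan_period_us * (sub_range_deg / scan_range_deg)
```

For example, a 360° radar with a 100 ms (100 000 µs) period and four cameras would give a 25 ms interval, so each camera's exposure can coincide with the radar sweeping its sub-range.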
In a possible implementation, the acquisition module 601 is further configured to, in response to receiving the synchronization trigger signal from the positioning module, acquire the attitude capture time of target attitude data captured by an inertial measurement unit;
the storage module 603 is further configured to determine the attitude capture time of the target attitude data as the timestamp information of the target attitude data, and to synchronously store the target attitude data and its timestamp information; the timestamp information of the target attitude data serves as a reference time when the target attitude data is fused with the current image frame.
In a possible implementation, the acquisition module 601 is further configured to, in response to receiving the synchronization trigger signal from the positioning module, acquire the point cloud scanning time of point cloud data scanned by the radar device;
the storage module 603 is further configured to determine the point cloud scanning time as the timestamp information of the point cloud data, and to synchronously store the point cloud data and its timestamp information; the timestamp information of the point cloud data serves as a reference time when the point cloud data is fused with the current image frame.
In a possible implementation, the acquisition module 601 is configured to acquire the point cloud scanning time of the point cloud data scanned by the radar device according to the following steps:
determining the trigger time corresponding to the synchronization trigger signal as the point cloud scanning time; or,
in the case where the first timing unit included in the radar device is synchronized in real time with the second timing unit included in the Ethernet controller, obtaining the timing time of the second timing unit and determining the timing time of the second timing unit as the point cloud scanning time, wherein the timing time of the second timing unit is the start timing time recorded by the second timing unit when the start trigger time corresponding to the synchronization trigger signal is synchronized to the second timing unit.
In a possible implementation, the above apparatus further includes:
a fusion module 604, configured to, after the current image frame and its timestamp information are synchronously stored, in response to receiving a data fusion instruction, acquire the current image frame and its timestamp information stored based on the above data processing method, and fuse the current image frame with already acquired data based on the timestamp information of the current image frame to obtain fused data.
For descriptions of the processing flow of each module in the apparatus and the interaction flow between the modules, reference may be made to the relevant descriptions in the above method embodiments, and details are not repeated here.
An embodiment of the present disclosure further provides an electronic device. As shown in FIG. 7, which is a schematic structural diagram of the electronic device provided by an embodiment of the present disclosure, the device includes a processor 701, a memory 702, and a bus 703. The memory 702 stores machine-readable instructions executable by the processor 701 (for example, execution instructions corresponding to the acquisition module 601, the determination module 602, and the storage module 603 of the apparatus in FIG. 6). When the electronic device is running, the processor 701 communicates with the memory 702 through the bus 703, and the machine-readable instructions are executed by the processor 701 to perform the following steps:
in response to receiving a synchronization trigger signal from the positioning module, acquiring the image acquisition time indicated by the synchronization trigger signal, and the image exposure duration corresponding to the current image frame acquired by the image sensor starting from the image acquisition time;
determining the timestamp information of the current image frame based on the image acquisition time and the image exposure duration corresponding to the current image frame;
synchronously storing the current image frame and the timestamp information of the current image frame, wherein the timestamp information of the current image frame serves as a reference time when the current image frame is fused with other acquired data.
An embodiment of the present disclosure further provides a computer-readable storage medium on which a computer program is stored; when the computer program is run by a processor, the steps of the data processing method described in the above method embodiments are executed. The storage medium may be a volatile or non-volatile computer-readable storage medium.
An embodiment of the present disclosure further provides a computer program product carrying program code; the instructions included in the program code may be used to execute the steps of the data processing method described in the above method embodiments. For details, reference may be made to the above method embodiments, and details are not repeated here.
The above computer program product may be implemented by hardware, software, or a combination thereof. In an optional embodiment, the computer program product is embodied as a computer storage medium; in another optional embodiment, the computer program product is embodied as a software product, such as a Software Development Kit (SDK).
Those skilled in the art can clearly understand that, for convenience and brevity of description, for the specific working processes of the system and apparatus described above, reference may be made to the corresponding processes in the foregoing method embodiments, and details are not repeated here. In the several embodiments provided in the present disclosure, it should be understood that the disclosed system, apparatus, and method may be implemented in other ways. The apparatus embodiments described above are merely illustrative; for example, the division of the units is only a logical functional division, and there may be other division manners in actual implementation; as another example, multiple units or components may be combined or integrated into another system, or some features may be ignored or not executed. In addition, the mutual coupling, direct coupling, or communication connection shown or discussed may be indirect coupling or communication connection through some communication interfaces, apparatuses, or units, and may be electrical, mechanical, or in other forms.
The units described as separate components may or may not be physically separated, and the components shown as units may or may not be physical units; that is, they may be located in one place or distributed across multiple network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, the functional units in the embodiments of the present disclosure may be integrated into one processing unit, each unit may exist physically alone, or two or more units may be integrated into one unit.
If the functions are implemented in the form of software functional units and sold or used as independent products, they may be stored in a non-volatile computer-readable storage medium executable by a processor. Based on this understanding, the technical solution of the present disclosure in essence, or the part contributing to the prior art, or part of the technical solution, may be embodied in the form of a software product. The computer software product is stored in a storage medium and includes several instructions for causing an electronic device (which may be a personal computer, a server, a network device, or the like) to execute all or part of the steps of the methods described in the embodiments of the present disclosure. The aforementioned storage medium includes various media that can store program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disc.
Finally, it should be noted that the above embodiments are only specific implementations of the present disclosure, used to illustrate the technical solutions of the present disclosure rather than to limit them, and the protection scope of the present disclosure is not limited thereto. Although the present disclosure has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art should understand that any person familiar with the technical field may, within the technical scope disclosed in the present disclosure, still modify the technical solutions described in the foregoing embodiments, easily conceive of changes, or make equivalent replacements of some of the technical features; such modifications, changes, or replacements do not cause the essence of the corresponding technical solutions to depart from the spirit and scope of the technical solutions of the embodiments of the present disclosure, and should all be covered within the protection scope of the present disclosure. Therefore, the protection scope of the present disclosure should be subject to the protection scope of the claims.

Claims (15)

  1. A data processing method, characterized in that the method comprises:
    in response to receiving a synchronization trigger signal from a positioning module, acquiring an image acquisition time indicated by the synchronization trigger signal, and an image exposure duration corresponding to a current image frame acquired by one of one or more image sensors starting from the image acquisition time;
    determining timestamp information of the current image frame based on the image acquisition time and the image exposure duration corresponding to the current image frame;
    synchronously storing the current image frame and the timestamp information of the current image frame.
  2. The method according to claim 1, characterized in that the determining the timestamp information of the current image frame based on the image acquisition time and the image exposure duration corresponding to the current image frame comprises:
    determining, based on the image acquisition time and a preset time difference interval, a row exposure start/end time of a first row of image data in the current image frame, wherein the row exposure start/end time refers to a row exposure start time or a row exposure end time;
    determining the timestamp information of the current image frame based on the row exposure start/end time of the first row of image data in the current image frame and the image exposure duration corresponding to the first row of image data.
  3. The method according to claim 2, characterized in that, in a case where all rows of image data in the current image frame are exposed simultaneously, the determining the timestamp information of the current image frame based on the row exposure start/end time of the first row of image data in the current image frame and the image exposure duration corresponding to the first row of image data comprises:
    obtaining a row exposure center time of the first row of image data based on the row exposure start/end time of the first row of image data in the current image frame and the image exposure duration corresponding to the first row of image data;
    determining the row exposure center time of the first row of image data as the timestamp information of the current image frame.
  4. The method according to claim 2, characterized in that, in a case where row exposure start times of adjacent rows of image data in the current image frame differ by one time unit, the determining the timestamp information of the current image frame based on the row exposure start/end time of the first row of image data in the current image frame and the image exposure duration corresponding to the first row of image data comprises:
    obtaining a row exposure center time of the first row of image data based on the row exposure start/end time of the first row of image data in the current image frame and the image exposure duration corresponding to the first row of image data;
    determining a row exposure center time of a middle row of image data based on the row exposure center time of the first row of image data and a number of time units between the middle row of image data and the first row of image data in the current image frame;
    determining the row exposure center time of the middle row of image data as the timestamp information of the current image frame.
  5. The method according to any one of claims 1-4, characterized in that the one or more image sensors comprise a plurality of image sensors, and the plurality of image sensors perform image acquisition synchronously; the acquiring, in response to receiving the synchronization trigger signal from the positioning module, the image acquisition time indicated by the synchronization trigger signal comprises:
    in response to receiving the synchronization trigger signal from the positioning module, generating, for each of the plurality of image sensors, an image acquisition trigger signal for triggering that image sensor to perform image acquisition;
    determining a trigger time corresponding to a rising edge of the image acquisition trigger signal as the image acquisition time of that image sensor.
  6. The method according to any one of claims 1-4, characterized in that the one or more image sensors comprise a plurality of image sensors, and the plurality of image sensors perform image acquisition staggered by a preset time interval; the acquiring, in response to receiving the synchronization trigger signal from the positioning module, the image acquisition time indicated by the synchronization trigger signal comprises:
    in response to receiving the synchronization trigger signal from the positioning module, performing frequency conversion processing and/or phase conversion processing on the synchronization trigger signal based on the preset time interval between the plurality of image sensors, to obtain image acquisition trigger signals for respectively triggering the plurality of image sensors to perform image acquisition, wherein time intervals between rising edges of the image acquisition trigger signals of the plurality of image sensors satisfy the preset time interval;
    determining, for each of the plurality of image sensors, a trigger time corresponding to a rising edge of the image acquisition trigger signal of that image sensor as the image acquisition time of that image sensor.
  7. The method according to claim 6, wherein the preset time interval between the plurality of image sensors is determined according to the following steps:
    acquiring a scanning range of a radar device, wherein the radar device and the image sensors perform acquisition for the same scene;
    dividing the scanning range into a plurality of scanning sub-ranges, wherein each image sensor of the plurality of image sensors is responsible for acquiring images within one of the plurality of scanning sub-ranges; and
    determining the preset time interval between two adjacent image sensors of the plurality of image sensors based on the scanning time of the radar device for each of the plurality of scanning sub-ranges.
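For the common case of a spinning lidar, the per-sub-range scanning time of claim 7 reduces to a simple division. The uniform angular split below is an assumption for illustration; the claim only requires that the interval be based on the per-sub-range scan time:

```python
# Sketch: derive the preset inter-camera interval from a spinning
# lidar's scan period. Equal angular sub-ranges (one per camera) are
# an assumption, not required by the claim.

def inter_camera_interval_s(scan_period_s: float, num_sensors: int) -> float:
    """Time the lidar spends in each of num_sensors equal angular
    sub-ranges; firing camera i after i such intervals keeps each
    exposure aligned with the lidar sweeping that camera's sector."""
    return scan_period_s / num_sensors

# Example: 10 Hz lidar (0.1 s per revolution), 4 cameras around it.
print(inter_camera_interval_s(0.1, 4))  # 0.025
```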
  8. The method according to any one of claims 1-7, wherein the method further comprises:
    in response to receiving the synchronization trigger signal of the positioning module, acquiring an attitude capture time of target attitude data captured by an inertial measurement unit; and
    determining the attitude capture time of the target attitude data as timestamp information of the target attitude data, and synchronously storing the target attitude data and the timestamp information of the target attitude data, wherein the timestamp information of the target attitude data is used as a reference time when the target attitude data is fused with the current image frame.
  9. The method according to any one of claims 1-8, wherein the method further comprises:
    in response to receiving the synchronization trigger signal of the positioning module, acquiring a point cloud scanning time of point cloud data scanned by a radar device;
    determining the point cloud scanning time as timestamp information of the point cloud data; and
    synchronously storing the point cloud data and the timestamp information of the point cloud data, wherein the timestamp information of the point cloud data is used as a reference time when the point cloud data is fused with the current image frame.
  10. The method according to claim 9, wherein acquiring the point cloud scanning time of the point cloud data scanned by the radar device comprises:
    determining the trigger time corresponding to the synchronization trigger signal as the point cloud scanning time; or
    in the case that a first timing unit included in the radar device is synchronized in real time with a second timing unit included in an Ethernet controller, acquiring the timing time of the second timing unit and determining the timing time of the second timing unit as the point cloud scanning time, wherein the timing time of the second timing unit is the start timing time recorded by the second timing unit when the start trigger time corresponding to the synchronization trigger signal is synchronized to the second timing unit.
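The second alternative of claim 10 amounts to re-basing a local counter to the trigger's start time. A minimal sketch, with a hypothetical counter API (real systems would typically use PTP/gPTP between the lidar and the Ethernet MAC):

```python
# Sketch: a timing unit re-based to the sync trigger's start time, as
# in the second alternative of claim 10. The counter interface is an
# assumption for illustration only.

class TimingUnit:
    def __init__(self) -> None:
        self._base_time_ns = 0   # trigger start time synchronized in
        self._base_tick_ns = 0   # local counter value at that moment

    def sync_to_trigger(self, trigger_time_ns: int, local_tick_ns: int) -> None:
        """Record the trigger's start time as the start timing time."""
        self._base_time_ns = trigger_time_ns
        self._base_tick_ns = local_tick_ns

    def now_ns(self, local_tick_ns: int) -> int:
        """Current timing time = trigger time + elapsed local ticks."""
        return self._base_time_ns + (local_tick_ns - self._base_tick_ns)

unit = TimingUnit()
unit.sync_to_trigger(trigger_time_ns=1_700_000_000_000_000_000, local_tick_ns=500)
# 300 ns of local ticks later, a point cloud packet is stamped:
print(unit.now_ns(local_tick_ns=800))  # 1700000000000000300
```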
  11. The method according to any one of claims 1-10, wherein after the current image frame and the timestamp information of the current image frame are synchronously stored, the method further comprises:
    in response to receiving a data fusion instruction, acquiring the stored current image frame and the timestamp information of the current image frame; and
    fusing the current image frame with already-acquired data based on the timestamp information of the current image frame, to obtain fused data.
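A common way to realize the timestamp-based fusion of claim 11 is nearest-timestamp matching: among the already-acquired records, pick the one closest in time to the image frame. The record layout `(time_ns, payload)` below is an assumption for illustration:

```python
# Sketch: timestamp-based association for fusion per claim 11 --
# select, from stored sensor records, the one nearest in time to the
# image frame's timestamp. Record layout is hypothetical.

def match_by_timestamp(frame_time_ns: int,
                       records: list[tuple[int, str]]) -> tuple[int, str]:
    """Return the stored record whose timestamp is nearest to the
    image frame's timestamp; that record is fused with the frame."""
    return min(records, key=lambda rec: abs(rec[0] - frame_time_ns))

point_clouds = [(1_000_000, "scan_a"), (1_050_000, "scan_b"), (1_100_000, "scan_c")]
print(match_by_timestamp(1_060_000, point_clouds))  # (1050000, 'scan_b')
```

Because every sensor's timestamp is anchored to the same synchronization trigger, the nearest-in-time record is also the one captured closest to the frame's actual exposure.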
  12. A data processing apparatus, comprising:
    an acquisition module, configured to, in response to receiving a synchronization trigger signal of a positioning module, acquire the image acquisition time indicated by the synchronization trigger signal and the image exposure duration corresponding to a current image frame acquired by an image sensor starting from the image acquisition time;
    a determination module, configured to determine timestamp information of the current image frame based on the image acquisition time and the image exposure duration corresponding to the current image frame; and
    a storage module, configured to synchronously store the current image frame and the timestamp information of the current image frame.
  13. A data processing system, wherein the system comprises a field-programmable gate array (FPGA) computing unit and an ARM processor;
    the ARM processor is configured to read, into a memory, the image acquisition time indicated by a synchronization trigger signal generated by a Global Positioning System (GPS) positioning unit and the image exposure duration corresponding to a current image frame acquired by an image sensor starting from the image acquisition time, and to input the image acquisition time and the image exposure duration corresponding to the current image frame to the FPGA computing unit; and
    the FPGA computing unit is configured to determine timestamp information of the current image frame based on the image acquisition time and the image exposure duration corresponding to the current image frame, and to synchronously store the current image frame and the timestamp information of the current image frame in the memory.
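The claims only state that the timestamp is determined "based on" the acquisition time and the exposure duration, without fixing a formula. One common convention, shown here purely as an assumption, is the mid-exposure time (acquisition time plus half the exposure):

```python
# Sketch: a mid-exposure timestamp from acquisition time plus exposure
# duration. This formula is an assumption -- the patent does not state
# which function of the two values the FPGA unit computes.

def frame_timestamp_ns(acquisition_time_ns: int, exposure_ns: int) -> int:
    """Mid-exposure timestamp for a frame started at acquisition_time_ns."""
    return acquisition_time_ns + exposure_ns // 2

# Example: trigger at t = 2 s, 10 ms exposure.
print(frame_timestamp_ns(2_000_000_000, 10_000_000))  # 2005000000
```

Mid-exposure stamping makes the image timestamp comparable to instantaneous sensors (lidar, IMU), since it marks when the scene was, on average, observed.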
  14. An electronic device, comprising: a processor, a memory, and a bus, wherein the memory stores machine-readable instructions executable by the processor; when the electronic device runs, the processor and the memory communicate with each other via the bus; and the machine-readable instructions, when executed by the processor, perform the steps of the data processing method according to any one of claims 1 to 11.
  15. A computer-readable storage medium, wherein a computer program is stored on the computer-readable storage medium, and the computer program, when run by a processor, performs the steps of the data processing method according to any one of claims 1 to 11.
PCT/CN2022/103025 2021-11-29 2022-06-30 Data processing method, apparatus, and system, device, and storage medium WO2023093054A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202111437106.4 2021-11-29
CN202111437106.4A CN114025055A (en) 2021-11-29 2021-11-29 Data processing method, device, system, equipment and storage medium

Publications (1)

Publication Number Publication Date
WO2023093054A1 true WO2023093054A1 (en) 2023-06-01

Family

ID=80066964

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2022/103025 WO2023093054A1 (en) 2021-11-29 2022-06-30 Data processing method, apparatus, and system, device, and storage medium

Country Status (2)

Country Link
CN (1) CN114025055A (en)
WO (1) WO2023093054A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116915978A (en) * 2023-08-07 2023-10-20 昆易电子科技(上海)有限公司 Trigger time determining method, data acquisition system, vehicle and industrial personal computer

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114025055A (en) * 2021-11-29 2022-02-08 上海商汤临港智能科技有限公司 Data processing method, device, system, equipment and storage medium
CN116156074B (en) * 2022-11-21 2024-03-15 辉羲智能科技(上海)有限公司 Multi-camera acquisition time synchronization method
CN116156143A (en) * 2023-02-10 2023-05-23 杭州灵伴科技有限公司 Data generation method, image pickup apparatus, head-mounted display apparatus, and readable medium
CN116233391A (en) * 2023-03-03 2023-06-06 北京有竹居网络技术有限公司 Apparatus, method and storage medium for image processing
CN116461539A (en) * 2023-04-10 2023-07-21 北京辉羲智能科技有限公司 Automatic driving SoC chip for collecting time sequence data of sensor

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20160316110A1 (en) * 2015-04-23 2016-10-27 Jonathan Ross Low-latency timing control
WO2018151792A1 (en) * 2017-02-16 2018-08-23 Qualcomm Incorporated Camera auto-calibration with gyroscope
US20190132516A1 (en) * 2016-05-20 2019-05-02 Sz Dji Osmo Technology Co., Ltd. Systems and methods for digital video stabalization
CN109922260A (en) * 2019-03-04 2019-06-21 中国科学院上海微***与信息技术研究所 The method of data synchronization and synchronizing device of image and inertial sensor
CN110435880A (en) * 2019-08-12 2019-11-12 深圳市道通智能航空技术有限公司 A kind of collecting method, unmanned plane and storage medium
CN111726539A (en) * 2019-03-20 2020-09-29 北京初速度科技有限公司 Image timestamp determination method and device
CN111736169A (en) * 2020-06-29 2020-10-02 杭州海康威视数字技术股份有限公司 Data synchronization method, device and system
CN112787740A (en) * 2020-12-26 2021-05-11 武汉光庭信息技术股份有限公司 Multi-sensor time synchronization device and method
CN114025055A (en) * 2021-11-29 2022-02-08 上海商汤临港智能科技有限公司 Data processing method, device, system, equipment and storage medium

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111726538B (en) * 2019-03-20 2021-10-01 北京初速度科技有限公司 Image exposure parameter measurement system and target equipment
CN110198415B (en) * 2019-05-26 2021-08-24 初速度(苏州)科技有限公司 Method and device for determining image timestamp
WO2021159332A1 (en) * 2020-02-12 2021-08-19 深圳元戎启行科技有限公司 Image acquisition trigger method and apparatus, and computer equipment, readable storage medium and monitoring equipment
CN113496545B (en) * 2020-04-08 2022-05-27 阿里巴巴集团控股有限公司 Data processing system, method, sensor, mobile acquisition backpack and equipment
CN113038027B (en) * 2021-03-05 2022-08-19 上海商汤临港智能科技有限公司 Exposure control method, device, equipment and storage medium
CN113518162B (en) * 2021-04-07 2023-04-07 浙江大华技术股份有限公司 Line exposure method, camera and computer readable storage medium



Also Published As

Publication number Publication date
CN114025055A (en) 2022-02-08

Similar Documents

Publication Publication Date Title
WO2023093054A1 (en) Data processing method, apparatus, and system, device, and storage medium
WO2021031604A1 (en) Method and device for hardware time synchronization between multi-channel imus and cameras of bionic eye
CN112672415B (en) Multi-sensor time synchronization method, device, system, electronic device and medium
US9654672B1 (en) Synchronized capture of image and non-image sensor data
US11240404B2 (en) Systems and methods for synchronizing sensor capture
CN111309094A (en) Synchronous board card and method for data acquisition of sensor equipment
CN103744372B (en) The multisensor method for synchronizing time of unmanned plane electric inspection process and system
CN109587405B (en) Time synchronization method and device
CN112383675B (en) Time synchronization method and device and terminal equipment
CN111860604B (en) Data fusion method, system and computer storage medium
CN113496545B (en) Data processing system, method, sensor, mobile acquisition backpack and equipment
CN103108125B (en) A kind of capture Synchronizing Control Devices of multicamera system and method thereof
CN112945228B (en) Multi-sensor time synchronization method and synchronization device
CN111007554A (en) Data acquisition time synchronization system and method
CN111934843A (en) Multi-sensor data synchronous acquisition method for intelligent unmanned system
CN112485806B (en) Laser radar and camera time synchronization system and method
CN112861660B (en) Laser radar array and camera synchronization device, method, equipment and storage medium
CN104202534A (en) Multi-camera synchronous control device based on GPS and pulse generator and method
CN111556226A (en) Camera system
CN113014812B (en) Camera and laser radar time synchronization control system
WO2020113358A1 (en) Systems and methods for synchronizing vehicle sensors and devices
CN114614934A (en) Time synchronization triggering device and method
Osadcuks et al. Clock-based time synchronization for an event-based camera dataset acquisition platform
CN112995524A (en) High-precision acquisition vehicle, and photo exposure information generation system, method and synchronization device thereof
CN214583305U (en) Data acquisition device and unmanned system

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22897150

Country of ref document: EP

Kind code of ref document: A1