CN113807282A - Data processing method and device and readable storage medium


Info

Publication number
CN113807282A
Authority
CN
China
Prior art keywords
time
vehicle
determining
angular velocity
image
Prior art date
Legal status
Pending
Application number
CN202111116418.5A
Other languages
Chinese (zh)
Inventor
王海川
谢非
Current Assignee
Tencent Technology Shenzhen Co Ltd
Original Assignee
Tencent Technology Shenzhen Co Ltd
Priority date
Filing date
Publication date
Application filed by Tencent Technology Shenzhen Co Ltd
Priority to CN202111116418.5A
Publication of CN113807282A

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 18/00 Pattern recognition
    • G06F 18/20 Analysing
    • G06F 18/21 Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
    • G06F 18/214 Generating training patterns; Bootstrap methods, e.g. bagging or boosting
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 18/00 Pattern recognition
    • G06F 18/20 Analysing
    • G06F 18/22 Matching criteria, e.g. proximity measures
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/70 Determining position or orientation of objects or cameras
    • G06T 7/73 Determining position or orientation of objects or cameras using feature-based methods
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/80 Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/20 Special algorithmic details
    • G06T 2207/20081 Training; Learning
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/30 Subject of image; Context of image processing
    • G06T 2207/30244 Camera pose
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/30 Subject of image; Context of image processing
    • G06T 2207/30248 Vehicle exterior or interior
    • G06T 2207/30252 Vehicle exterior; Vicinity of vehicle
    • G06T 2207/30256 Lane; Road marking

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • Artificial Intelligence (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Evolutionary Biology (AREA)
  • Evolutionary Computation (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • General Engineering & Computer Science (AREA)
  • Traffic Control Systems (AREA)
  • Image Analysis (AREA)

Abstract

The application discloses a data processing method, a data processing device and a readable storage medium. The method comprises the following steps: acquiring a driving image associated with a vehicle, the driving image containing a lane line in the road surface on which the vehicle is driving; determining a first angular velocity change curve of the vehicle in a first time interval according to the lane line included in the driving image; determining a second angular velocity change curve of the vehicle in a second time interval according to the set of angular velocities of the vehicle detected by a sensor in the second time interval; and performing time calibration on the camera device that captured the driving image and the sensor according to the curve correlation degree between the first angular velocity change curve of the vehicle in the first time interval and the second angular velocity change curve of the vehicle in the second time interval. With the method and device, calibration cost can be reduced and calibration accuracy improved in a time calibration scenario. The application can be applied to the traffic field.

Description

Data processing method and device and readable storage medium
Technical Field
The present application relates to the field of computer technologies, and in particular, to a data processing method, device, and readable storage medium.
Background
As a spatial positioning method, Visual-Inertial Odometry (VIO) is widely used in the fields of Virtual Reality (VR), Augmented Reality (AR), unmanned driving and mobile robots. Typically, a VIO system includes two sensors, a visual sensor (i.e., a camera or video camera) and an inertial sensor (an Inertial Measurement Unit, IMU), which work in combination rather than independently.
Generally, the IMU and camera clocks are assumed to be synchronized and aligned when the VIO system operates. However, because of trigger delay, transmission delay and inaccurate clock synchronization in hardware systems, a time deviation usually exists between the IMU and the camera, and wrongly treating their times as synchronized and aligned degrades the performance of the VIO system. In order to estimate and correct the time deviation between the camera and the IMU, and thereby effectively improve the performance of the VIO system, the prior art generally uses a calibration board to measure the rotation of the camera: corner points are extracted to calculate the angular velocity, and the angular velocity of the camera is then compared with the IMU readings to solve for the time difference between the camera and the IMU. However, this method depends heavily on the calibration board and on a specific calibration scenario, and deploying such a scenario consumes considerable manpower and material resources, so the cost is high. It also places high demands on hardware quality (such as camera imaging quality); for scenes where the camera imaging quality is poor, the captured images are not sharp enough, corner points cannot be accurately extracted from the calibration board, the angular velocity cannot be accurately calculated, and the accuracy of the resulting time difference is low.
Disclosure of Invention
The embodiment of the application provides a data processing method, data processing equipment and a readable storage medium, which can reduce calibration cost and improve calibration accuracy in a time calibration scene.
An embodiment of the present application provides a data processing method, including:
acquiring a driving image associated with a vehicle; the driving image is obtained by shooting by a camera device configured on the vehicle; the driving image contains a lane line in a road surface on which the vehicle is driving;
determining a first angular velocity change curve of the vehicle in a first time interval according to the lane line included in the driving image;
determining a second angular velocity change curve of the vehicle in a second time interval according to the set of angular velocities of the vehicle detected by a sensor in the second time interval;
and performing time calibration on the camera device and the sensor according to the curve correlation degree between the first angular velocity change curve of the vehicle in the first time interval and the second angular velocity change curve of the vehicle in the second time interval.
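For ease of understanding, the overall flow of the above steps can be summarized in the following minimal Python sketch. It assumes that the angular velocities derived from the lane lines and the angular velocities read from the sensor have already been obtained as time-stamped samples; the helper names, the Pearson correlation used as the curve correlation degree, the 0.9 threshold and the 0.05 s search step are illustrative assumptions and are not values taken from this application.

```python
import numpy as np

def calibrate_camera_imu(cam_times, cam_rates, imu_times, imu_rates,
                         corr_threshold=0.9, step=0.05, max_shift=2.0):
    """Sketch of the claimed flow: shift the sensor (IMU) angular-velocity curve
    along the time axis, keep the shift at which it best matches the camera-derived
    curve, and accept it only if the two curves are strongly correlated."""
    best_shift, best_corr = 0.0, -1.0
    for shift in np.arange(-max_shift, max_shift + step, step):
        # resample the shifted IMU curve onto the camera timestamps
        resampled = np.interp(cam_times, imu_times + shift, imu_rates)
        corr = np.corrcoef(cam_rates, resampled)[0, 1]   # curve correlation degree (assumed Pearson)
        if corr > best_corr:
            best_shift, best_corr = shift, corr
    # the shift is returned as the time calibration result only when the curves are highly correlated
    return best_shift if best_corr >= corr_threshold else None
```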
An embodiment of the present application provides a data processing apparatus, including:
an image acquisition module for acquiring a travel image associated with a vehicle; the driving image is obtained by shooting by a camera device configured on the vehicle; the driving image contains a lane line in a road surface on which the vehicle is driving;
the curve determining module is used for determining a first angular speed change curve of the vehicle in a first time interval according to the lane lines included in the driving image;
a curve determination module, further configured to determine a second angular velocity variation curve of the vehicle in a second time interval according to the set of angular velocities of the vehicle detected by the sensor in the second time interval;
and the time calibration module is used for performing time calibration on the camera equipment and the sensor according to the curve association degree between the first angular speed change curve of the vehicle in the first time interval and the second angular speed change curve of the vehicle in the second time interval.
In one embodiment, the time scaling module includes:
the correlation matching unit is used for determining the curve correlation degree between a first angular velocity change curve in a first time interval and a second angular velocity change curve in a second time interval, and matching the curve correlation degree with a correlation threshold;
the time updating unit is used for updating the time of a second time interval for mapping the second angular velocity change curve if the curve correlation degree is smaller than the correlation degree threshold value to obtain an updating time interval for remapping the second angular velocity change curve; the duration of the update time interval is equal to the duration of the second time interval;
the time calibration unit is used for carrying out time calibration on the camera equipment and the sensor according to a second angular speed change curve of the vehicle in the updated time interval and a first angular speed change curve of the vehicle in the first time interval;
and the time calibration unit is further used for taking the time matching result as a time calibration result between the camera equipment and the sensor if the curve correlation degree is greater than the correlation degree threshold value.
In one embodiment, the second angular velocity change curve is in a curve coordinate system, and the curve coordinate system comprises a time axis;
a time update unit comprising:
the curve translation subunit is used for acquiring the translation amount, and translating a second angular velocity change curve in a second time interval in the curve coordinate system according to the translation amount and the target translation direction; the target translation direction and the axis direction of the time axis belong to a parallel relation;
the time stamp obtaining subunit is configured to obtain a curve starting position and a curve ending position of the translated second angular velocity change curve in a curve coordinate system, and obtain a starting time stamp corresponding to the curve starting position and an ending time stamp corresponding to the curve ending position on a time axis;
and the interval determining subunit is used for determining a time interval formed by the starting time stamp and the ending time stamp as an updating time interval.
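As an illustration of the interval bookkeeping described above, the following small Python sketch shifts a time interval along the time axis by a translation amount; the numbers in the usage line are made up for illustration.

```python
def update_time_interval(second_interval, translation, direction=+1):
    """Translate a time interval parallel to the time axis.
    second_interval: (start_timestamp, end_timestamp) of the second curve, in seconds.
    The returned update time interval has the same duration as the original."""
    start, end = second_interval
    delta = direction * translation          # target translation direction is parallel to the time axis
    return (start + delta, end + delta)

# e.g. translating the interval [0.0, 50.0] by 0.05 s gives [0.05, 50.05]
print(update_time_interval((0.0, 50.0), 0.05))
```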
In an embodiment, the time calibration unit is further specifically configured to determine an update curve association degree between the second angular velocity change curve in the update time interval and the first angular velocity change curve in the first time interval, and match the update curve association degree with the association degree threshold;
and the time calibration unit is further specifically configured to determine a time difference between the update time interval and the second time interval if the correlation degree of the update curve is greater than the correlation degree threshold, and determine the time difference as a time calibration result between the camera device and the sensor.
In one embodiment, the first driving image pair and the second driving image pair are included in the driving image; the first driving image pair comprises a first driving sub-image and a second driving sub-image; the second driving image pair comprises a third driving sub-image and a fourth driving sub-image;
a curve determination module comprising:
the angular velocity determining unit is used for determining a first angular velocity corresponding to the vehicle at a first moment according to lane lines respectively included in the first driving sub-image and the second driving sub-image; the first moment is determined based on a first shooting moment and a second shooting moment, the first shooting moment is the moment when the camera device shoots the first driving sub-image, and the second shooting moment is the moment when the camera device shoots the second driving sub-image;
the angular velocity determining unit is further used for determining a second angular velocity corresponding to the vehicle at a second moment according to lane lines respectively included in the third driving sub-image and the fourth driving sub-image; the second moment is determined according to a third shooting moment and a fourth shooting moment, the third shooting moment is the moment when the camera equipment shoots a third driving sub image, and the fourth shooting moment is the moment when the camera equipment shoots a fourth driving sub image;
and the curve determining unit is used for determining a first angular velocity change curve corresponding to the vehicle according to the first angular velocity, the second angular velocity, the first moment and the second moment.
In one embodiment, the angular velocity determination unit includes:
the offset degree determining subunit is used for determining a first offset angle corresponding to the vehicle at the first shooting moment according to the lane line included in the first driving sub-image; the first offset angle is the offset angle between the driving direction of the vehicle at the first shooting moment and the lane line;
the offset degree determining subunit is further configured to determine a second offset angle corresponding to the vehicle at the second shooting time according to the lane line included in the second driving sub-image; the second offset angle is the offset angle between the driving direction of the vehicle at the second shooting moment and the lane line;
and the angular speed determining subunit is used for determining a first angular speed corresponding to the vehicle at the first moment according to the first offset angle, the second offset angle, the first shooting moment and the second shooting moment.
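A later embodiment specifies that this first angular velocity is obtained by dividing the absolute value of the offset-angle difference by the shooting interval duration and assigning it to the intermediate shooting time; a minimal Python illustration of that computation is given below, with made-up numbers in the usage line.

```python
def angular_velocity_between_frames(angle1, t1, angle2, t2):
    """First angular velocity implied by two adjacent driving sub-images:
    the absolute value of the offset-angle difference divided by the shooting
    interval duration, assigned to the intermediate shooting time."""
    mid_time = (t1 + t2) / 2.0                     # intermediate shooting time (the first moment)
    omega = abs(angle2 - angle1) / (t2 - t1)       # |angle difference| / shooting interval duration
    return mid_time, omega

# e.g. offset angles 0.10 rad at t = 0 s and 0.15 rad at t = 2 s give 0.025 rad/s at t = 1 s
print(angular_velocity_between_frames(0.10, 0.0, 0.15, 2.0))
```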
In an embodiment, the offset determining subunit is further specifically configured to identify a lane line included in the first driving sub-image, and determine a pixel coordinate of a pixel point corresponding to the lane line in the image coordinate system; the image coordinate system is a coordinate system corresponding to the first driving subimage;
the offset degree determining subunit is further specifically configured to perform coordinate conversion on the pixel coordinate according to the initial pitch angle value to obtain a spatial position coordinate of the pixel point in a world coordinate system;
and the offset degree determining subunit is further specifically configured to perform straight line fitting on the spatial position coordinates to obtain a fitted straight line corresponding to the pixel point, and determine a first offset angle corresponding to the vehicle at the first shooting time according to the fitted straight line.
In one embodiment, the offset degree determining subunit is further specifically configured to obtain a vertical distance corresponding to the camera device; the vertical distance is a straight-line distance between the camera equipment and the road ground;
and the offset degree determining subunit is further specifically configured to determine a rotation matrix corresponding to the camera device according to the initial pitch angle value, and determine spatial position coordinates of the pixel points in a world coordinate system according to the rotation matrix, the vertical distance, and the pixel coordinates.
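For ease of understanding, a simplified Python sketch of such a pixel-to-ground conversion is given below. It assumes a flat road, a pinhole camera with intrinsic matrix K, a rotation determined only by the pitch angle, and a camera mounted at height cam_height above the road; these assumptions and the chosen axis conventions are for illustration only and are not the disclosed implementation.

```python
import numpy as np

def pixel_to_ground(u, v, K, pitch_rad, cam_height):
    """Project an image pixel onto the road plane (Z = 0 in the world frame).
    World origin: vertical projection of the camera on the ground; X forward,
    Y left, Z up. Camera frame: x right, y down, z forward, tilted downward by
    pitch_rad about its x axis. Returns the (X, Y) ground coordinates."""
    # back-project the pixel to a viewing ray in the camera frame
    ray_cam = np.linalg.inv(K) @ np.array([u, v, 1.0])
    # rotation taking camera-frame vectors into a level camera frame (pitch only);
    # a positive pitch tilts the optical axis towards the road
    cp, sp = np.cos(pitch_rad), np.sin(pitch_rad)
    R_pitch = np.array([[1.0, 0.0, 0.0],
                        [0.0,  cp,  sp],
                        [0.0, -sp,  cp]])
    # axis permutation: camera (x right, y down, z forward) -> world (X fwd, Y left, Z up)
    P = np.array([[0.0, 0.0, 1.0],
                  [-1.0, 0.0, 0.0],
                  [0.0, -1.0, 0.0]])
    ray_world = P @ R_pitch @ ray_cam
    if ray_world[2] >= 0:
        raise ValueError("pixel lies at or above the horizon; no ground intersection")
    # intersect the ray from the camera centre (0, 0, cam_height) with the plane Z = 0
    t = -cam_height / ray_world[2]
    point = np.array([0.0, 0.0, cam_height]) + t * ray_world
    return point[0], point[1]
```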
In one embodiment, the lane lines included in the first travel sub-image include N lane lines, and the first travel sub-image corresponds to N fitted straight lines;
the offset degree determining subunit is further specifically configured to obtain a slope of a line corresponding to each of the N fitting lines;
the offset degree determining subunit is further specifically configured to sort the N linear slopes according to a size order to obtain a linear slope sequence;
the offset degree determining subunit is further specifically configured to obtain a first straight-line slope and a second straight-line slope in sequence in the straight-line slope sequence, determine a fitting straight line corresponding to the first straight-line slope as a first target fitting straight line, and determine a fitting straight line corresponding to the second straight-line slope as a second target fitting straight line;
the offset degree determining subunit is further specifically configured to determine a first offset angle corresponding to the vehicle at the first shooting time according to the first target fitting straight line and the second target fitting straight line.
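A brief Python sketch of this slope-based selection is shown below; it assumes each fitted straight line is represented by a (slope, intercept) pair in the ground plane, and that the two target lines are taken as the first two entries of the slope-ordered sequence, which is an illustrative reading of the selection rule.

```python
def pick_target_lines(fitted_lines):
    """fitted_lines: list of (slope, intercept) pairs for the N fitted lane lines.
    Sort the straight-line slopes in order of size and take two lines whose slopes
    are adjacent in that sequence as the first and second target fitted lines."""
    ordered = sorted(fitted_lines, key=lambda line: line[0])   # straight-line slope sequence
    first_target, second_target = ordered[0], ordered[1]       # adjacent slopes in the sequence
    return first_target, second_target

# e.g. three fitted lane lines; the two with adjacent (smallest) slopes are selected
print(pick_target_lines([(0.02, -1.8), (0.10, 1.9), (0.01, 1.7)]))
```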
In one embodiment, the offset determining subunit is further specifically configured to determine a vertical projection ground position of the camera device on a horizontal ground of the road, and determine the vertical projection ground position as a world coordinate origin of the world coordinate system;
the offset degree determining subunit is further specifically configured to determine an intersection point between the first target fitting straight line and the second target fitting straight line, and a distance between the intersection point and the world coordinate origin;
the offset degree determining subunit is further specifically configured to determine a linear relationship between the first target fitting straight line and the second target fitting straight line according to the distance;
the offset degree determining subunit is further specifically configured to determine, according to the straight-line relationship, a first offset angle corresponding to the vehicle at the first shooting time.
In an embodiment, the offset determining subunit is further specifically configured to determine, if the straight line relationship is a parallel relationship, that the initial pitch angle value is a correct pitch angle value, and determine, as the first offset angle, an included angle between the driving direction of the vehicle at the first shooting time and the first target fitting straight line;
the offset degree determining subunit is further specifically configured to, if the linear relationship is a non-parallel relationship, adjust the initial pitch angle value, perform coordinate conversion on the pixel coordinate according to the adjusted pitch angle value to obtain an updated spatial position coordinate of the pixel point in the world coordinate system, perform linear fitting on the updated spatial position coordinate to obtain an updated fitted straight line corresponding to the pixel point, and determine a first offset angle corresponding to the vehicle at the first shooting time according to the updated fitted straight line.
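A compact Python sketch of this idea follows. The application describes adjusting the pitch value and refitting until the two target lines come out parallel; the sketch instead runs a simple grid search over candidate pitch values and keeps the one whose fitted lines intersect farthest from the world origin (i.e., are closest to parallel). The callable fit_lane_lines, the search span and the distance-based parallelism test are assumptions for illustration.

```python
import numpy as np

def offset_angle_with_pitch_search(fit_lane_lines, pitch0, span=np.radians(2.0),
                                   step=np.radians(0.05)):
    """fit_lane_lines(pitch) -> [(k1, b1), (k2, b2)]: the two target lane lines fitted
    in the ground plane (y = k * x + b, with x along the driving direction) for a
    candidate pitch value. The correct pitch makes the physical lane lines come out
    parallel, i.e. their intersection lies far from the world coordinate origin."""
    def intersection_distance(lines):
        (k1, b1), (k2, b2) = lines
        if np.isclose(k1, k2):
            return np.inf                       # identical slopes: exactly parallel
        x = (b2 - b1) / (k1 - k2)               # intersection point of the two fitted lines
        return np.hypot(x, k1 * x + b1)         # its distance from the world coordinate origin
    candidates = np.arange(pitch0 - span, pitch0 + span, step)
    best_pitch = max(candidates, key=lambda p: intersection_distance(fit_lane_lines(p)))
    k1, _ = fit_lane_lines(best_pitch)[0]
    # offset angle between the driving direction (the x axis) and a target lane line
    return np.arctan(k1), best_pitch
```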
In one embodiment, the first shooting moment and the second shooting moment are adjacent shooting moments;
the angular velocity determining subunit is further specifically configured to determine an absolute value of an angle difference between the first offset angle and the second offset angle;
the angular velocity determining subunit is further specifically configured to acquire an intermediate shooting time between the first shooting time and the second shooting time, and determine the intermediate shooting time as the first time;
the angular speed determining subunit is further specifically configured to determine a shooting interval duration between the first shooting time and the second shooting time, and determine a first angular speed of the vehicle at the first time according to the absolute value of the angle difference and the shooting interval duration.
In one embodiment, the curve determining unit includes:
the initial curve determining subunit is configured to determine an initial angular velocity variation curve corresponding to the vehicle according to the first time, the first angular velocity, the second time, and the second angular velocity;
the mean value data determining subunit is used for determining a mean value moment between the first moment and the second moment, and determining a mean value angular velocity between the first angular velocity and the second angular velocity as an angular velocity corresponding to the vehicle at the mean value moment;
and the curve determining subunit is used for determining a first angular velocity change curve corresponding to the vehicle according to the first moment, the first angular velocity, the second moment, the second angular velocity, the mean moment and the mean angular velocity.
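For ease of understanding, the mean-value densification can be sketched in Python as follows; it simply inserts the mean moment and the mean angular velocity between every two neighbouring samples before the curve is drawn.

```python
def densify_with_midpoints(times, omegas):
    """Insert the mean moment and the mean angular velocity between each pair of
    neighbouring (time, angular velocity) samples of the first curve."""
    dense_t, dense_w = [times[0]], [omegas[0]]
    for (t0, w0), (t1, w1) in zip(zip(times, omegas), zip(times[1:], omegas[1:])):
        dense_t.extend([(t0 + t1) / 2.0, t1])      # mean moment, then the next sample
        dense_w.extend([(w0 + w1) / 2.0, w1])      # mean angular velocity, then the next sample
    return dense_t, dense_w

# e.g. samples at the 1st and 3rd second also yield a mean sample at the 2nd second
print(densify_with_midpoints([1.0, 3.0], [0.02, 0.06]))
```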
An aspect of an embodiment of the present application provides a computer device, including: a processor and a memory;
the memory stores a computer program that, when executed by the processor, causes the processor to perform the method in the embodiments of the present application.
An aspect of the embodiments of the present application provides a computer-readable storage medium, in which a computer program is stored, where the computer program includes program instructions, and the program instructions, when executed by a processor, perform the method in the embodiments of the present application.
In one aspect of the application, a computer program product or computer program is provided, the computer program product or computer program comprising computer instructions stored in a computer readable storage medium. The processor of the computer device reads the computer instructions from the computer-readable storage medium, and the processor executes the computer instructions to cause the computer device to perform the method provided by one aspect of the embodiments of the present application.
In the embodiment of the application, a driving image of a vehicle driving on the road can be captured by the camera device, where the driving image contains a lane line. Then, from the lane lines in the driving image, a first angular velocity change curve of the vehicle in a first time interval can be determined; the sensor in the vehicle can also detect a set of angular velocities of the vehicle, from which a second angular velocity change curve of the vehicle in a second time interval can be determined; the degree of correlation between the first angular velocity change curve and the second angular velocity change curve (i.e., the curve correlation degree) can then be calculated, and the camera device and the sensor can be calibrated in time accordingly. Therefore, when time calibration is performed between the camera device and the sensor, no additional equipment and no special external environment are needed; only a lane line is required, so the method is highly flexible and the calibration cost is greatly reduced. Meanwhile, performing time calibration by determining the curve correlation degree between the angular velocity change curve corresponding to the camera device and the angular velocity change curve corresponding to the sensor means that calibration is carried out only when the angular velocity derived from the images captured by the camera device and the angular velocity detected by the sensor are highly correlated, so the resulting time calibration result has high accuracy. In conclusion, the method and the device can improve calibration flexibility and accuracy and reduce calibration cost in a scenario where the camera device and the sensor are time-calibrated.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art will be briefly described below, it is obvious that the drawings in the following description are only some embodiments of the present invention, and for those skilled in the art, other drawings can be obtained according to the drawings without creative efforts.
Fig. 1 is a diagram of a network architecture provided by an embodiment of the present application;
fig. 2 is a schematic flowchart of a data processing method according to an embodiment of the present application;
FIG. 3 is a schematic diagram of an angular velocity profile provided by an embodiment of the present application;
FIG. 4 is a schematic diagram of a translation angular velocity variation curve for time calibration according to an embodiment of the present application;
FIG. 5 is a flow chart of data processing provided by an embodiment of the present application;
FIG. 6 is a schematic diagram of a curve smoothing process provided in an embodiment of the present application;
FIG. 7 is a schematic flow chart illustrating a process for determining an offset angle of a vehicle according to an embodiment of the present disclosure;
FIG. 8a is a schematic diagram of a scenario for determining an offset angle according to an embodiment of the present application;
FIG. 8b is a schematic diagram illustrating a scenario of determining an offset angle according to an embodiment of the present application;
fig. 9 is a schematic structural diagram of a data processing apparatus according to an embodiment of the present application;
fig. 10 is a schematic structural diagram of a computer device according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only a part of the embodiments of the present application, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
The present application relates to the field of intelligent transportation, and for ease of understanding, intelligent transportation and its related concepts will be first described.
An Intelligent Vehicle Infrastructure Cooperative System (IVICS), referred to as the vehicle-infrastructure cooperative system for short, is a development direction of Intelligent Transportation Systems (ITS). The vehicle-infrastructure cooperative system adopts advanced wireless communication, new-generation internet and other technologies, implements dynamic real-time vehicle-to-vehicle and vehicle-to-infrastructure information interaction in all directions, and carries out vehicle active safety control and cooperative road management on the basis of full-time dynamic traffic information acquisition and fusion, fully realizing effective cooperation among people, vehicles and roads, ensuring traffic safety and improving traffic efficiency, thereby forming a safe, efficient and environmentally friendly road traffic system.
The present application relates to Advanced Driving Assistance Systems (ADAS) and Visual-Inertial Navigation Systems (VINS) in an intelligent vehicle-road coordination System.
The ADAS uses various sensors (such as millimeter wave radar, laser radar, single/binocular camera, satellite navigation, etc.) installed on the vehicle to sense the surrounding environment at any time during the driving process of the vehicle, collect data, perform identification, detection and tracking of static and dynamic objects, and perform systematic operation and analysis by combining with navigation map data, thereby letting drivers perceive the danger that may occur in advance, and effectively increasing the comfort and safety of vehicle driving.
A visual inertial navigation system (also called a Visual-Inertial Odometer, VIO) is a computer vision technique that uses a camera and an Inertial Measurement Unit (IMU) as sensors to measure and then estimate displacement and attitude changes, and realizes simultaneous localization and mapping (SLAM) mainly by fusing camera data and IMU data. The camera sensor captures photons within a certain exposure time to generate an electric signal and obtain a digital picture, so it can acquire rich and accurate information under slow motion, while the IMU measures the angular velocity and linear acceleration of the body and is not affected by the external environment. Because the camera relies on capturing photons within a certain exposure time, it is affected by the exposure time, the aperture (which determines the amount of light) and the working distance; in low-texture scenes (such as white walls or snowfields), high-speed scenes (prone to motion blur) and high-dynamic-range scenes, the position cannot be accurately estimated, and the output frame rate of the camera is typically 100 Hz. In contrast, the IMU has higher accuracy in fast-motion scenes but lower accuracy in low-speed scenes because of drift and measurement noise, and its output frequency is much higher than that of the camera, generally around 1000 Hz. The camera sensor and the IMU are therefore complementary in their usage scenarios, and more accurate navigation and positioning can be achieved by fusing camera information and IMU information.
As a spatial positioning method, VIO is widely applied to spatial positioning in the VR/AR, unmanned driving and mobile robot fields, as well as to vehicle navigation. In order to better fuse camera sensor information and IMU sensor information, the present application mainly provides a method for time calibration between the camera and the IMU in a VIO system.
Referring to fig. 1, fig. 1 is a diagram of a network architecture according to an embodiment of the present disclosure. As shown in fig. 1, the network architecture may include a service server 1000 and a terminal device cluster, and the terminal device cluster may include one or more terminal devices, where the number of terminal devices is not limited herein. As shown in fig. 1, the plurality of terminal devices may include a terminal device 100a, a terminal device 100b, a terminal device 100c, …, a terminal device 100 n; as shown in fig. 1, the terminal device 100a, the terminal device 100b, the terminal devices 100c, …, and the terminal device 100n may be respectively in network connection with the service server 1000, so that each terminal device may perform data interaction with the service server 1000 through the network connection.
According to the embodiment of the application, one terminal device can be selected from a plurality of terminal devices to serve as the target terminal device, and the terminal device can be an intelligent vehicle-mounted device, but the application is not limited to this. The intelligent vehicle-mounted device can be deployed in a vehicle (such as a bus, a car, a truck, and the like). For example, the terminal device 100a shown in fig. 1 may be used as the target terminal device, which may be disposed in a vehicle, and the vehicle may also be disposed with a camera device (also referred to as a visual sensor or a camera device) and an inertial measurement unit (also referred to as an inertial sensor or an IMU). The camera device can capture images (which may be called driving images) of vehicles driving on the road surface, and the IMU can also measure corresponding angular speed sets of the vehicles at different times. The driving image captured by the camera device and the set of angular velocities measured by the IMU at different times may be used as service data, and the target terminal device may send the service data to the service server 1000 through network connection.
Subsequently, the service server 1000 may determine a first angular velocity variation curve of the vehicle in a first time interval according to the lane lines included in the driving image (i.e., the lane lines on the road surface in the driving image); similarly, the service server 1000 may also determine a second angular velocity variation curve of the vehicle in a second time interval (a time interval formed by different times corresponding to the angular velocity set measured by the IMU) according to the angular velocity set measured by the IMU at different times. The service server 1000 may calculate a degree of correlation (which may be referred to as a curve correlation degree) between the first angular velocity change curve and the second angular velocity change curve, and may perform time calibration on the camera device (i.e., the camera sensor) and the IMU sensor according to the degree of correlation, so that information of the camera sensor and information of the IMU sensor may be better fused for more accurate positioning or navigation. For a specific implementation manner of determining the first angular velocity variation curve and performing time calibration on the camera device and the IMU sensor, reference may be made to the description in the embodiment corresponding to fig. 2.
It is understood that the method provided by the embodiment of the present application may be executed by a computer device, which includes but is not limited to a terminal device or a service server. The service server may be an independent physical server, a server cluster or a distributed system formed by a plurality of physical servers, or a cloud server providing basic cloud computing services such as cloud service, a cloud database, cloud computing, a cloud function, cloud storage, network service, cloud communication, middleware service, domain name service, security service, CDN, big data and an artificial intelligence platform.
The terminal device and the service server may be directly or indirectly connected through wired or wireless communication, and the present application is not limited herein.
Alternatively, it is understood that the computer device (the service server 1000, the terminal device 100a, the terminal device 100b, and the like) may be a node in a distributed system, where the distributed system may be a blockchain system, and the blockchain system may be a distributed system formed by connecting a plurality of nodes through network communication. The nodes may form a peer-to-peer (P2P) network, and the P2P protocol is an application-layer protocol running on top of the Transmission Control Protocol (TCP). In a distributed system, any form of computer device, such as a service server or an electronic device such as a terminal device, may become a node of the blockchain system by joining the peer-to-peer network. For ease of understanding, the concept of a blockchain is explained below: a blockchain is a novel application mode of computer technologies such as distributed data storage, point-to-point transmission, consensus mechanisms and encryption algorithms, and is mainly used to organize data in chronological order and encrypt it into a ledger, so that the data cannot be tampered with or forged while remaining verifiable, storable and updatable. When the computer device is a blockchain node, the tamper-proof and forgery-proof characteristics of the blockchain give the data in this application (such as driving images, angular velocity change curves and time calibration results) authenticity and security, so that results obtained after relevant data processing based on these data are more reliable.
Further, please refer to fig. 2, and fig. 2 is a schematic flowchart of a data processing method according to an embodiment of the present application. The data processing method may be executed by a service server (e.g., the service server 1000 in the embodiment corresponding to fig. 1) or may be executed by a terminal device (e.g., any terminal device in a terminal device cluster in the embodiment corresponding to fig. 1, such as the terminal device 100 a); the data processing method can also be executed by the terminal device and the service server together, and for the convenience of understanding, the data processing method is executed by the terminal device as an example. As shown in fig. 2, the flow may include at least the following steps S101 to S104:
step S101, acquiring a running image associated with a vehicle; the driving image is obtained by shooting by a camera device configured on the vehicle; the driving image contains lane lines in the road surface on which the vehicle is driving.
In the application, the camera device may be a visual sensor having a shooting function, such as a camera and a vehicle data recorder, and the camera device may be deployed in a vehicle and may be used to shoot a driving environment of the vehicle in a driving process on a road surface to obtain a driving image. The terminal device may include a vehicle-mounted computer, a computer, or the like having a computing function. The terminal device may be connected to a camera device of a vehicle. The terminal device may send a request to the camera device to cause the camera device to feed back images (which may be referred to as driving images) that it takes while the vehicle is driving on the road surface (which may include lane lines that are in a parallel relationship in the road surface). The camera equipment can also actively send the driving image to the terminal equipment after shooting and collecting the driving image, so that the terminal equipment can receive the driving image.
The terminal device may be disposed on a vehicle, which may be, for example, an autonomous vehicle, a vehicle equipped with an Advanced Driving Assistance System (ADAS), an unmanned aerial vehicle, or the like.
It should be understood that the terminal device may further have a sensing function, and when the vehicle actually runs on the road, the terminal device may identify objects around the vehicle based on the environment image collected by the camera device, so as to determine the running strategy of the vehicle.
It is understood that the image captured by the camera device may be a video image, and the terminal device may use each frame of image in the video image as a driving image (i.e., an image to be processed), or screen out an image to be processed from the video image as a driving image, for example, capture an image every few frames as a driving image to be processed. Certainly, in actual application, the image acquired by the camera device may also be a picture, the camera device may take a picture at intervals (e.g., every few milliseconds, every few seconds, every few minutes, etc.), and the picture may be sent to the terminal device as a driving image.
It can be understood that, assuming the driving images collected by the terminal device are pictures, the vehicle can be controlled to perform lane-change driving (for example, two consecutive lane changes) at a relatively high speed (for example, 40 km/h) on a road surface containing lane lines, and during this process the camera device is set to take one picture (called a driving image) every 2 seconds; a plurality of driving images taken at different times can thus be obtained, and the camera device can send these driving images taken at different times to the terminal device.
Step S102, determining a first angular speed change curve of the vehicle in a first time interval according to the lane lines included in the driving image.
In the application, after the terminal device acquires a plurality of driving images shot by the camera device at different times, a time interval (which can be called as a first time interval) can be formed according to the different times, and according to lane lines included in each driving image, the angular speed of the vehicle at each time in the first time interval can be determined. A coordinate system including a time axis and an angular velocity axis may be established, and an angular velocity profile (which may be referred to as a first angular velocity profile) may be drawn in the coordinate system according to the corresponding angular velocity of the vehicle at each time within the first time interval.
To facilitate understanding of the first angular velocity variation curve, please refer to fig. 3 together, and fig. 3 is a schematic diagram of an angular velocity variation curve provided in an embodiment of the present application. As shown in fig. 3, taking the example of establishing a planar rectangular coordinate system (a coordinate system formed by two axes perpendicular to each other and having a common origin point on the same plane), the two axes in the planar rectangular coordinate system are respectively placed in the horizontal position and the vertical position, and the right direction and the upward direction are respectively positive directions of the two axes. Among them, the number axis in the horizontal direction may be referred to as an x axis (or a horizontal axis); the number axis perpendicular to the x axis may be referred to as the y axis (or longitudinal axis). It should be understood that the x-axis and the y-axis may be collectively referred to as coordinate axes, and any one of the x-axis and the y-axis may be taken as a time axis (while the other coordinate axis is taken as an angular velocity axis), where the x-axis may be taken as a time axis and the y-axis may be taken as an angular velocity axis. The common origin of the x-axis and the y-axis (which may be referred to as the intersection) is referred to as the origin of the rectangular coordinate system.
As shown in fig. 3, taking as an example that the camera device takes one picture every 10 seconds starting from a certain moment (for example, 10:30 on September 14, 2021, the moment when the vehicle starts lane-change driving), the camera device may take 5 pictures within 0 to 50 s, obtaining 5 driving images. The first time interval may then be determined as [0, 50] (i.e., 0 s to 50 s; in clock time, 10:30:00 to 10:30:50 on September 14, 2021). The angular velocity corresponding to the vehicle at each moment in the first time interval [0, 50] can be determined from the lane lines included in the 5 driving images captured by the camera device. For example, the angular velocity of the vehicle at the 0th second may be determined as 0 rad/s (assuming the vehicle has not yet started lane-change driving at the 0th second, its angular velocity at that moment may be 0); the angular velocity at the 5th second may be 0.025 rad/s; the angular velocity at the 10th second may be 0.1 rad/s; the angular velocity at the 15th second may be 0.2 rad/s; …; and at the 50th second the corresponding angular velocity may be 0.07 rad/s.
A time instant and its corresponding angular velocity can be combined into one coordinate, e.g. the 5 th second and the corresponding angular velocity 0.025rad/s can be combined into one coordinate (5, 0.025). Therefore, a plurality of coordinates can be obtained, the points corresponding to the coordinates can be obtained in the planar rectangular coordinate system, and then the points can be smoothly connected, so that the angular velocity variation curve as shown in fig. 3 can be drawn, and the angular velocity variation curve can be referred to as a first angular velocity variation curve. It should be noted that the time interval (every 10s), the time interval ([0, 50]), the angular velocity variation curve, and the like of the camera capturing the image provided in the embodiment corresponding to fig. 3 are examples for easy understanding, and do not have actual reference meanings.
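The plotting step can be sketched in a few lines of Python; the sample values below are the illustrative figures from the example above (with the elided intermediate samples omitted), and linear interpolation stands in for the smooth connection of the points.

```python
import numpy as np

# (time, angular velocity) points recovered from the lane lines in the example above
times  = np.array([0.0, 5.0, 10.0, 15.0, 50.0])      # seconds within the first time interval [0, 50]
omegas = np.array([0.0, 0.025, 0.1, 0.2, 0.07])      # rad/s

# connect the points to obtain the first angular-velocity change curve,
# here sampled densely over the first time interval
curve_t = np.linspace(times[0], times[-1], 501)
curve_w = np.interp(curve_t, times, omegas)
```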
For the above-mentioned determining the angular velocity of the vehicle at each time within the first time interval according to the lane line included in each driving image, and then determining the specific implementation manner of the first angular velocity variation curve, refer to the description in the embodiment corresponding to fig. 5.
Step S103, determining a second angular speed change curve of the vehicle in a second time interval according to the angular speed set of the vehicle detected by the sensor in the second time interval.
In the present application, the sensor may be an inertial sensor, where an inertial sensor is a sensor that detects and measures acceleration, tilt, shock, vibration, rotation, and multi-degree-of-freedom (DoF) motion. Inertial sensors are important components for solving navigation, orientation and motion-carrier control problems. Inertial sensors may include accelerometers, angular rate sensors (also known as gyroscopes), Attitude and Heading Reference Systems (AHRS), Inertial Measurement Units (IMU), and so on. Among them, the acceleration sensor measures inertial force using a sensing mass and generally consists of a proof mass (the sensing element) and a detection circuit. The gyroscope senses angular velocity through the Coriolis effect acting on a vibrating proof mass when the base (housing) rotates, and mainly adopts frame-driven (inner and outer frame), comb-driven, electromagnetically driven and other structures. The AHRS mainly consists of a three-axis gyroscope, an acceleration sensor and a magnetic sensor, is solved according to the quaternion method, and can directly output the pitch angle, roll angle and heading angle of the moving body. The IMU mainly consists of three acceleration sensors, three gyroscopes and a solving circuit, and is mainly used to measure the angular velocity and linear acceleration of the body without being affected by the outside environment. The inertial sensor in this application may refer to an inertial sensor capable of measuring the angular velocity of the body, such as an angular velocity sensor (gyroscope) or an IMU.
The inertial sensor may be deployed in a vehicle, and when the vehicle is driving through a fast lane change, the inertial sensor may measure an angular velocity of the vehicle at each time, and the terminal device may be connected to the inertial sensor, so as to read an angular velocity value measured by the inertial sensor at each time, so as to obtain an angular velocity set corresponding to the vehicle at each time, which is measured by the inertial sensor, and determine a second time interval according to the times (the second time interval may be the same as the first time interval, but may also be different, for example, the second time interval may be slightly smaller than the first time interval). According to the set of the corresponding angular velocities of the vehicle at each moment in the second time interval measured by the inertial sensor, a second angular velocity change curve of the vehicle in the second time interval can be drawn. The specific implementation manner of determining the second angular velocity change curve according to the angular velocity of the vehicle measured by the inertial sensor at each time is the same as the manner of determining the first angular velocity change curve according to the angular velocity of the vehicle determined by the lane line at each time (i.e., establishing a coordinate system and then drawing in the coordinate system), and how to draw the second angular velocity change curve will not be repeated here.
It is understood that the second angular velocity profile may be plotted in a coordinate system (e.g., a rectangular plane coordinate system) in which the first angular velocity profile is located. The first angular velocity profile and the second angular velocity profile can thus be placed in the same coordinate system, whereby a comparison of the first angular velocity profile and the second angular velocity profile can be facilitated.
And step S104, carrying out time calibration on the camera equipment and the sensor according to the curve association degree between the first angular speed change curve of the vehicle in the first time interval and the second angular speed change curve of the vehicle in the second time interval.
In this application, after the first angular velocity variation curve and the second angular velocity variation curve are determined, a degree of correlation between the first angular velocity variation curve in the first time interval and the second angular velocity variation curve in the second time interval (which may be understood as a degree of correlation between an angular velocity (i.e., corresponding to the first angular velocity variation curve) captured by the camera device and an angular velocity (i.e., corresponding to the second angular velocity variation curve) measured by the inertial sensor) may be calculated, and this degree of correlation may be referred to as a curve correlation degree. And time calibration can be carried out on the camera equipment and the sensor according to the curve correlation degree.
Regarding the curve correlation degree between the first angular velocity change curve in the first time interval and the second angular velocity change curve in the second time interval, a specific implementation of performing time calibration on the camera device and the inertial sensor may be as follows: the curve correlation degree between the first angular velocity change curve in the first time interval and the second angular velocity change curve in the second time interval may be determined, and a correlation degree threshold may be obtained (either manually specified or generated by an algorithm; the correlation degree threshold may be expressed as a percentage (e.g., 90%), a fraction (e.g., 9/10), a decimal (e.g., 0.9), etc., which is not limited here), and the curve correlation degree may then be matched against the correlation degree threshold. If the curve correlation degree is smaller than the correlation degree threshold, the second time interval used for mapping the second angular velocity change curve can be updated in time to obtain an update time interval for remapping the second angular velocity change curve, and time calibration is performed on the camera device and the sensor according to the second angular velocity change curve of the vehicle in the update time interval and the first angular velocity change curve of the vehicle in the first time interval; the duration of the update time interval is equal to the duration of the second time interval. If the curve correlation degree is greater than the correlation degree threshold, the time matching result can be used as the time calibration result between the camera device and the sensor.
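As an illustration of this matching step, the curve correlation degree can be computed on a common time grid and compared with the threshold as in the following Python sketch; the Pearson correlation coefficient and the 0.9 threshold are assumptions used for illustration, since the application does not fix a particular correlation measure.

```python
import numpy as np

def curve_correlation(cam_t, cam_w, imu_t, imu_w):
    """Curve correlation degree between the first (camera-derived) and second (sensor)
    angular-velocity change curves, evaluated at the camera timestamps."""
    imu_on_cam = np.interp(cam_t, imu_t, imu_w)       # sample both curves at the same moments
    return np.corrcoef(cam_w, imu_on_cam)[0, 1]       # assumed Pearson correlation

ASSOCIATION_THRESHOLD = 0.9                           # illustrative correlation degree threshold

def is_synchronized(cam_t, cam_w, imu_t, imu_w):
    """True when the correlation exceeds the threshold, i.e. the time matching result
    can be taken as the time calibration result (time difference 0)."""
    return curve_correlation(cam_t, cam_w, imu_t, imu_w) >= ASSOCIATION_THRESHOLD
```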
It should be understood that when a first angular velocity change curve in a first time interval and a second angular velocity change curve in a second time interval are determined, because problems of trigger delay, transmission delay, clock inaccurate synchronization and the like exist, the time between a camera device (camera) and an inertial sensor (IMU) is likely to be unsynchronized, that is, a time difference exists between the camera device and the inertial sensor (IMU), and then the application can judge whether the time between the camera device and the inertial sensor is synchronized through the curve correlation degree between the first angular velocity change curve and the second angular velocity change curve; and when the time between the camera equipment and the inertial sensor is asynchronous, the time difference between the camera equipment and the inertial sensor is determined through the curve correlation degree, so that the camera equipment and the inertial sensor can be subjected to time calibration, the camera equipment and the inertial sensor are better fused, and more accurate positioning is realized.
It is understood that, when the first time interval is the same as the second time interval (i.e. the first time interval and the second time interval are the same time interval), when the curve correlation (i.e. the degree of correlation) between the first angular velocity variation curve in the first time interval and the second angular velocity variation curve in the second time interval is greater than (or equal to) the correlation threshold, it can be considered that the angular velocity captured by the camera device is very correlated with the angular velocity measured by the inertial sensor (the degree of coincidence is very high), and the camera device and the inertial sensor both measure the angular velocity with very high degree of coincidence in the same time interval, and then it can be considered that the time between the camera device and the inertial sensor is synchronous (i.e. the time calibration result between the camera device and the inertial sensor is the time matching result, the time difference is 0).
If the curve correlation degree between the first angular velocity change curve in the first time interval and the second angular velocity change curve in the second time interval is smaller than the correlation degree threshold, it may be determined that the angular velocity derived from the camera device and the angular velocity measured by the inertial sensor are not correlated (or not correlated strongly enough). In this case one of the two curves may be kept unchanged while the other is moved so that the two overlap more and more; by moving one curve, its time interval changes and an updated time interval is obtained, and at the same time the first angular velocity change curve and the second angular velocity change curve become more overlapped and correlated. When the curve correlation degree between the two angular velocity change curves becomes greater than the correlation degree threshold, the time variation of the moved curve can be read off; this time variation can be understood as the time difference between the updated time interval and the original time interval (the time interval corresponding to the moved curve, i.e., one of the first time interval and the second time interval). This time difference can then be used as the time difference between the camera device and the inertial sensor, i.e., the time calibration result.
In the present application, for convenience of calculation and comparison, the first angular velocity variation curve and the second angular velocity variation curve may be plotted and displayed in the same coordinate system. When the curve association degree between the first angular velocity change curve in the first time interval and the second angular velocity change curve in the second time interval is smaller than the association degree threshold, one of the first angular velocity change curve and the second angular velocity change curve may be moved in the coordinate system to change the time interval of the one angular velocity change curve, so that the curve association degree between the two angular velocity change curves is changed.
Take as an example the case where the first angular velocity change curve and the second angular velocity change curve are in the same coordinate system (which may be referred to as a curve coordinate system); the curve coordinate system may be a rectangular plane coordinate system, with its horizontal axis taken as the time axis and its vertical axis taken as the angular velocity axis. When the curve correlation degree between the first angular velocity change curve in the first time interval and the second angular velocity change curve in the second time interval is smaller than the correlation degree threshold, the position of the first angular velocity change curve may be kept unchanged, and the second angular velocity change curve may be translated in the horizontal direction (i.e., parallel to the time axis), so that the time interval of the second angular velocity change curve (i.e., the second time interval) is updated. That is, regarding updating the time of the second time interval used for mapping the second angular velocity change curve to obtain an update time interval for remapping the second angular velocity change curve, a specific implementation may be as follows: a translation amount can be obtained, and the second angular velocity change curve in the second time interval in the curve coordinate system can be translated according to the translation amount and a target translation direction, where the target translation direction is parallel to the axis direction of the time axis; then, a curve starting position and a curve ending position of the translated second angular velocity change curve can be obtained in the curve coordinate system, and a starting timestamp corresponding to the curve starting position and an ending timestamp corresponding to the curve ending position are obtained on the time axis; the time interval formed by the starting timestamp and the ending timestamp can be determined as the update time interval.
Further, after the second angular velocity change curve is translated, the second angular velocity change curve in the update time interval may be determined again, and the curve association degree between it and the first angular velocity change curve in the first time interval (which may be referred to as an update curve association degree) may be matched against the association degree threshold. If the update curve association degree is greater than (or equal to) the association degree threshold, the time difference between the update time interval and the second time interval may be determined, and that time difference may be determined as the time calibration result between the camera device and the sensor. If the update curve association degree is still smaller than the association degree threshold, the second angular velocity change curve may be translated again (in the target translation direction, by the translation amount), the second angular velocity change curve in the new time interval may be determined again, and the new curve association degree between it and the first angular velocity change curve in the first time interval may be calculated again, until the curve association degree is greater than the association degree threshold. At that point, the time translation amount corresponding to the second angular velocity change curve (i.e., the total time variation caused by the translations) may be determined, and that time translation amount may be used as the time difference (i.e., the time calibration result) between the camera device and the inertial sensor.
It should be understood that the above translation amount may refer to a single translation amount, which may be the length corresponding to a fixed single time variation in the curve coordinate system. Each time the angular velocity change curve is moved, it is translated by the length corresponding to that single time variation, so that it reaches a new position in the curve coordinate system and a new time interval is obtained. For example, if the single time variation is 0.05 s, the length corresponding to 0.05 s may be obtained on the time axis of the curve coordinate system, and the angular velocity change curve may then be translated in the horizontal direction (the horizontal axis being the time axis and the vertical axis being the angular velocity axis) by exactly that length. It should be understood that, when translation makes the curve association degree between the two angular velocity change curves greater than the association degree threshold, the number of translations may be obtained and multiplied by the single time variation; the product may be taken as the time difference between the camera device and the inertial sensor. Of course, the translation amount may also be different each time; in that case, when the curve association degree of the two angular velocity change curves becomes greater than the association degree threshold through translation, the time difference between the final time interval and the original time interval (for example, the second time interval) may be obtained (for example, by subtracting the start timestamp of the original time interval from that of the final time interval), and that time difference may be used as the time difference (time calibration result) between the camera device and the inertial sensor. Alternatively, the time variations of all translations may be summed, and the summation result may be used as the time difference between the camera device and the inertial sensor.
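As a concrete illustration of this shift-and-compare procedure, the following sketch (in Python, using only NumPy) samples both curves on a common time grid, repeatedly translates the sensor curve by a fixed single time variation, and stops once the correlation between the two curves exceeds the threshold. The function names, the Pearson correlation as the association degree, the sampling step, and the shift direction are assumptions made for illustration, not part of the original method.

```python
import numpy as np

def curve_association(a: np.ndarray, b: np.ndarray) -> float:
    """Curve association degree, computed here as the Pearson correlation."""
    a, b = a - a.mean(), b - b.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    return float((a * b).sum() / denom) if denom > 0 else 0.0

def estimate_time_offset(t_cam, w_cam, t_imu, w_imu,
                         step=0.05, threshold=0.95, max_shift=5.0):
    """Translate the sensor curve along the time axis by `step` seconds per
    iteration until its association with the camera curve exceeds `threshold`.
    Returns the accumulated shift (number of translations * step), i.e. the
    time calibration result, or None if no shift within `max_shift` works."""
    grid = np.arange(max(t_cam[0], t_imu[0]), min(t_cam[-1], t_imu[-1]), step)
    cam = np.interp(grid, t_cam, w_cam)          # camera angular velocity curve
    n_translations = 0
    while n_translations * step <= max_shift:
        shift = n_translations * step
        # Curve moved right by `shift`: its value at time t is the sensor
        # value originally recorded at time t - shift.
        imu = np.interp(grid, np.asarray(t_imu) + shift, w_imu)
        if curve_association(cam, imu) >= threshold:
            return shift
        n_translations += 1
    return None
```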
For ease of understanding, please refer to fig. 4 together; fig. 4 is a schematic diagram of translating an angular velocity change curve for time calibration according to an embodiment of the present application. The curve coordinate system shown in fig. 4 may be the planar rectangular coordinate system in the embodiment corresponding to fig. 3, in which the curve 40a may be the curve in the embodiment corresponding to fig. 3; the curve 40a may be the first angular velocity change curve corresponding to the camera device, and the first time interval corresponding to the first angular velocity change curve is [0, 50]. In the curve coordinate system, a second angular velocity change curve (i.e., curve 40b) may be plotted based on the angular velocities measured by the inertial sensor, and the second time interval corresponding to the second angular velocity change curve is also [0, 50].
Further, the curve association degree between the first angular velocity change curve (i.e., curve 40a) in the first time interval [0, 50] and the second angular velocity change curve (i.e., curve 40b) in the second time interval [0, 50] may be calculated, and then compared with the association degree threshold. If the curve association degree is 0.8 and the threshold is 0.95, the comparison determines that the curve association degree is smaller than the threshold, and the second angular velocity change curve (i.e., curve 40b) may then be translated.
As shown in fig. 4, assume the single time variation is 2.5 seconds. The length corresponding to this 2.5-second variation is obtained in the curve coordinate system, and the curve 40b is translated by that length along the positive direction of the time axis (the horizontal direction), so that the curve 40b reaches a new position and now lies in a new time interval [2.5, 52.5]. At this point, the new curve association degree between the curves 40a and 40b may be calculated again; suppose it is 0.96. After comparing the new curve association degree 0.96 with the association degree threshold 0.95, it may be determined that the new curve association degree is greater than the threshold, and the curves 40a and 40b may be considered sufficiently correlated. The angular velocity captured by the camera device in the time interval [0, 50] is thus correlated with (can be understood as synchronized with) the angular velocity measured by the inertial sensor in the time interval [2.5, 52.5]. Because the time changed by translating the curve 40b is 2.5 seconds, the 2.5 seconds can be taken as the time difference between the camera device and the inertial sensor, i.e., 2.5 seconds is the time calibration result between the camera device and the inertial sensor.
In the embodiment of the present application, a driving image of a vehicle driving on the road surface can be captured by the camera device, and the driving image contains a lane line. Then, from the lane lines in the driving image, a first angular velocity change curve of the vehicle in a first time interval can be determined; a sensor in the vehicle can also detect an angular velocity set of the vehicle, and a second angular velocity change curve of the vehicle in a second time interval can be determined from that angular velocity set; the curve association degree between the first angular velocity change curve and the second angular velocity change curve can be calculated, and the camera device and the sensor can then be calibrated in time. Thus, when time calibration is performed between the camera device and the sensor, no additional equipment and no special external environment are required, only a lane line is needed, which gives high flexibility and greatly reduces the calibration cost; meanwhile, performing time calibration by determining the curve association degree between the angular velocity change curve corresponding to the camera device and the angular velocity change curve corresponding to the sensor means that the calibration is carried out only when the correlation between the angular velocity obtained from the camera device and the angular velocity detected by the sensor is high (i.e., the two are strongly correlated), so the obtained time calibration result has high accuracy. In conclusion, the present application can improve calibration flexibility and accuracy and reduce calibration cost in the scenario of time calibration between a camera device and a sensor.
Further, please refer to fig. 5, wherein fig. 5 is a schematic flow chart of data processing according to an embodiment of the present application. This process may correspond to the process of determining the angular velocity of the vehicle at each time within the first time interval and determining the first curve of the angular velocity in the embodiment corresponding to fig. 2 described above. As shown in fig. 5, the flow may include at least the following steps S501 to S503:
step S501, determining a first angular speed corresponding to the vehicle at a first moment according to lane lines respectively included in the first driving sub-image and the second driving sub-image; the first time is determined based on a first shooting time and a second shooting time, the first shooting time is a time when the camera device shoots the first driving sub-image, and the second shooting time is a time when the camera device shoots the second driving sub-image.
Specifically, in the embodiment of the present application, the driving images include a first driving image pair and a second driving image pair (the first driving image pair includes a first driving sub-image and a second driving sub-image; the second driving image pair includes a third driving sub-image and a fourth driving sub-image). According to the embodiment corresponding to fig. 2, the camera device may capture one driving image at regular intervals (or at random times) to obtain a plurality of driving images within a period of time. After obtaining the plurality of driving images, the terminal device may group the driving images in pairs according to their capturing times to obtain a plurality of driving image pairs, and the driving images contained in each driving image pair may be referred to as driving sub-images. The first driving image pair may be any one of the plurality of driving image pairs, and the first driving sub-image and the second driving sub-image may be the driving images included in the first driving image pair. The second driving image pair may be the driving image pair whose capturing time immediately follows that of the first driving image pair, and the third driving sub-image and the fourth driving sub-image may be the driving images included in the second driving image pair.
For example, suppose the driving images include driving image 1, driving image 2, driving image 3, driving image 4, driving image 5, and driving image 6, where driving image 1 was captured earlier than driving image 2, driving image 2 earlier than driving image 3, driving image 3 earlier than driving image 4, driving image 4 earlier than driving image 5, and driving image 5 earlier than driving image 6. In this case, images with adjacent capturing times are combined into pairs: driving image 1 and driving image 2 form the pair (driving image 1, driving image 2), driving image 2 and driving image 3 form the pair (driving image 2, driving image 3), driving image 3 and driving image 4 form the pair (driving image 3, driving image 4), driving image 4 and driving image 5 form the pair (driving image 4, driving image 5), and driving image 5 and driving image 6 form the pair (driving image 5, driving image 6). The driving images contained in each pair may be referred to as driving sub-images; for example, the pair (driving image 1, driving image 2) may be the first driving image pair, in which driving image 1 may be referred to as the first driving sub-image and driving image 2 as the second driving sub-image. Since the driving image pair whose capturing time follows the first driving image pair is the pair (driving image 2, driving image 3) (the earliest capturing time in the pair (driving image 2, driving image 3) is that of driving image 2, the earliest capturing time in the pair (driving image 1, driving image 2) is that of driving image 1, and driving image 2 was captured after driving image 1), the pair (driving image 2, driving image 3) can be used as the second driving image pair.
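A minimal sketch of this pairing step is given below; the `DrivingImage` structure and its field names are hypothetical and only serve to illustrate grouping images with adjacent capture times into overlapping pairs.

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class DrivingImage:
    capture_time: float   # shooting time of this driving image (seconds)
    path: str             # where the image data is stored (hypothetical field)

def build_image_pairs(images: List[DrivingImage]) -> List[Tuple[DrivingImage, DrivingImage]]:
    """Sort the driving images by capture time and pair each image with the
    next one, producing (image 1, image 2), (image 2, image 3), ..."""
    ordered = sorted(images, key=lambda im: im.capture_time)
    return [(ordered[i], ordered[i + 1]) for i in range(len(ordered) - 1)]
```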
Further, the first angular velocity of the vehicle at the first time may be determined based on the lane lines included in the first driving sub-image and the second driving sub-image, respectively. That is, the angular velocity of the vehicle at a certain time (the radians traveled per unit time, such as the radians traveled per second) can be determined from two successive driving images captured by the camera device. The specific method is as follows: a first offset angle corresponding to the vehicle at the first shooting time is determined according to the lane line included in the first driving sub-image, where the first offset angle is the offset angle between the driving direction of the vehicle at the first shooting time and the lane line; a second offset angle corresponding to the vehicle at the second shooting time is then determined according to the lane line included in the second driving sub-image, where the second offset angle is the offset angle between the driving direction of the vehicle at the second shooting time and the lane line; the first angular velocity corresponding to the vehicle at the first time can then be determined from the first offset angle, the second offset angle, the first shooting time, and the second shooting time. For a specific implementation of determining the first offset angle corresponding to the vehicle at the first shooting time according to the lane line included in the first driving sub-image, reference may be made to the description in the embodiment corresponding to fig. 7 below. For a specific implementation of determining the second offset angle corresponding to the vehicle at the second shooting time according to the lane line included in the second driving sub-image, reference may be made to the description of determining the first offset angle.
Further, taking the first shooting time and the second shooting time as adjacent times as an example, a specific method for determining the first angular velocity of the vehicle at the first time according to the first offset angle, the second offset angle, the first shooting time and the second shooting time may be: determining an absolute value of an angle difference between the first offset angle and the second offset angle; subsequently, an intermediate shooting time between the first shooting time and the second shooting time may be acquired, and the intermediate shooting time may be determined as the first time; then, a shooting interval duration between the first shooting time and the second shooting time may be determined, and a first angular velocity of the vehicle at the first time may be determined based on the absolute value of the angle difference and the shooting interval duration. For example, the absolute value of the angle difference may be divided by the duration of the shooting interval, and the obtained value may be used as the first angular velocity of the vehicle at the first time.
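A sketch of this computation, under the assumptions that the two shooting times are adjacent and that the intermediate (midpoint) time is used as the first time, might look as follows; the function name and the units (radians, seconds) are illustrative assumptions.

```python
def angular_velocity_between(offset_angle_1: float, offset_angle_2: float,
                             t_shot_1: float, t_shot_2: float):
    """Return (first_time, first_angular_velocity) from two offset angles
    (radians) measured at two shooting times (seconds)."""
    angle_diff = abs(offset_angle_1 - offset_angle_2)   # absolute angle difference
    interval = abs(t_shot_2 - t_shot_1)                 # shooting interval duration
    first_time = (t_shot_1 + t_shot_2) / 2.0            # intermediate shooting time
    return first_time, angle_diff / interval            # radians per second
```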
Of course, another time between the first shooting time and the second shooting time (for example, the time at the 1/3 position) may also be used as the first time; how the first time is determined may be chosen according to the actual scene requirements.
Step S502, determining a second angular velocity corresponding to the vehicle at a second time according to the lane lines respectively included in the third driving sub-image and the fourth driving sub-image; the second time is determined according to a third shooting time and a fourth shooting time, the third shooting time being the time when the camera device captures the third driving sub-image, and the fourth shooting time being the time when the camera device captures the fourth driving sub-image.
Specifically, for the determination of the second angular velocity, the same way as the determination of the first angular velocity may be referred to the description of determining the first angular velocity in step S501, and the description will not be repeated here.
Step S503, determining a first angular velocity variation curve corresponding to the vehicle according to the first angular velocity, the second angular velocity, the first time and the second time.
Specifically, the specific method for determining the first angular velocity variation curve corresponding to the vehicle according to the first angular velocity, the second angular velocity, the first time and the second time may be: determining an initial angular velocity change curve corresponding to the vehicle according to the first moment, the first angular velocity, the second moment and the second angular velocity; then, a mean time between the first time and the second time may be determined, and a mean angular velocity between the first angular velocity and the second angular velocity may be determined as the angular velocity of the vehicle corresponding to the mean time; then, a corresponding first angular velocity variation curve of the vehicle can be determined according to the first time, the first angular velocity, the second time, the second angular velocity, the mean time and the mean angular velocity.
It is to be understood that determining the first angular velocity of the vehicle at the first time and the second angular velocity at the second time amounts to determining the angular velocity of the vehicle at each time. A coordinate system can then be established and the angular velocity change curve plotted in it; the plotted curve may be called the initial angular velocity change curve. This initial angular velocity change curve can be used as the final first angular velocity change curve.
In a possible embodiment, in order to make the angular velocity change curve smoother, it may be smoothed using a mean-smoothing method. Namely: the mean time between every two adjacent times (e.g., the first time and the second time) may be determined, the angular velocity values corresponding to those two times may be averaged to obtain a mean angular velocity, and this mean angular velocity may be taken as the angular velocity value corresponding to the mean time. In this way, angular velocity values corresponding to a number of mean times are obtained, and a final, smoother first angular velocity change curve may be obtained by redrawing the curve from the angular velocities corresponding to the first time, the second time, and the mean times, as sketched below.
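The following small sketch of this mean-smoothing step inserts, between every two adjacent samples, the mean time with the mean angular velocity, and returns the densified sequence from which the smoother curve can be redrawn; the function name is an assumption.

```python
def mean_smooth(times, angular_velocities):
    """Insert the mean time / mean angular velocity between adjacent samples."""
    out_t, out_w = [times[0]], [angular_velocities[0]]
    for i in range(1, len(times)):
        out_t.append((times[i - 1] + times[i]) / 2.0)                           # mean time
        out_w.append((angular_velocities[i - 1] + angular_velocities[i]) / 2.0)  # mean angular velocity
        out_t.append(times[i])
        out_w.append(angular_velocities[i])
    return out_t, out_w

# Example using the values from the fig. 6 illustration below:
t, w = mean_smooth([0, 10, 20, 30, 40, 50], [0, 0.1, 0.2, 0.15, 0.1, 0.15])
# t -> [0, 5, 10, 15, 20, 25, 30, 35, 40, 45, 50]
# w -> [0, 0.05, 0.1, 0.15, 0.2, 0.175, 0.15, 0.125, 0.1, 0.125, 0.15]
```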
For ease of understanding, please refer to fig. 6 together; fig. 6 is a schematic diagram illustrating the smoothing of a curve according to an embodiment of the present application. As shown in fig. 6, a curve 60a (which may be the initial angular velocity change curve) may be plotted in a coordinate system based on the angular velocity 0 at the 0th second, 0.1 at the 10th second, 0.2 at the 20th second, 0.15 at the 30th second, 0.1 at the 40th second, and 0.15 at the 50th second. Further, from the angular velocities at the 0th and 10th seconds, the angular velocity at the 5th second (i.e., the mean time between the 0th and 10th seconds, (0+10)/2 = 5) can be determined as 0.05 (i.e., the mean of 0 and 0.1, (0+0.1)/2 = 0.05); from the angular velocities at the 10th and 20th seconds, the angular velocity at the 15th second (i.e., the mean time between the 10th and 20th seconds, (10+20)/2 = 15) can be determined as 0.15 (i.e., the mean of 0.1 and 0.2, (0.1+0.2)/2 = 0.15). Similarly, it can be determined that the angular velocity at the 25th second is 0.175, the angular velocity at the 35th second is 0.125, and the angular velocity at the 45th second is 0.125.
Further, the smoothed curve 60b is obtained by redrawing based on the angular velocities corresponding to the 0th, 5th, 10th, 15th, 20th, 25th, 30th, 35th, 40th, 45th, and 50th seconds; the curve 60b is smoother because it contains more coordinate points than the curve 60a. It should be noted that the values and curves provided in the embodiment corresponding to fig. 6 are examples given for ease of understanding and do not represent actual data.
In the embodiment of the present application, a driving image of a vehicle driving on the road surface can be captured by the camera device, and the driving image contains a lane line. Then, from the lane lines in the driving image, a first angular velocity change curve of the vehicle in a first time interval can be determined; a sensor in the vehicle can also detect an angular velocity set of the vehicle, and a second angular velocity change curve of the vehicle in a second time interval can be determined from that angular velocity set; the curve association degree between the first angular velocity change curve and the second angular velocity change curve can be calculated, and the camera device and the sensor can then be calibrated in time. Thus, when time calibration is performed between the camera device and the sensor, no additional equipment and no special external environment are required, only a lane line is needed, which gives high flexibility and greatly reduces the calibration cost; meanwhile, performing time calibration by determining the curve association degree between the angular velocity change curve corresponding to the camera device and the angular velocity change curve corresponding to the sensor means that the calibration is carried out only when the correlation between the angular velocity obtained from the camera device and the angular velocity detected by the sensor is high (i.e., the two are strongly correlated), so the obtained time calibration result has high accuracy. In conclusion, the present application can improve calibration flexibility and accuracy and reduce calibration cost in the scenario of time calibration between a camera device and a sensor.
Further, please refer to fig. 7, fig. 7 is a schematic flowchart illustrating a process of determining an offset angle of a vehicle according to an embodiment of the present application. This process may correspond to the process of determining the first offset angle corresponding to the vehicle at the first shooting time according to the lane line included in the first driving sub-image in the embodiment corresponding to fig. 5. As illustrated in fig. 7, the flow may include at least the following steps S601 to S603:
step S601, identifying a lane line included in the first driving sub-image, and determining pixel coordinates of pixel points corresponding to the lane line in an image coordinate system; the image coordinate system is a coordinate system corresponding to the first driving sub-image.
Specifically, the terminal device may have an image recognition function: a pre-trained image recognition module may be deployed in the terminal device, and the lane lines in the driving image are recognized by this image recognition module.
In practical applications, a large number of training images may be acquired in advance; some of these images may contain various lane lines while others may contain none. Lane line annotation may be performed on each training image, specifically by marking the position of each lane line in the training image. These annotations serve as the real lane line labels of the training images, so a deep learning network can be trained with the real lane line labels, and the trained deep learning network is the image recognition module.
It can be understood that, when the driving image is input into the image recognition module, the module can extract the features of each pixel point in the driving image and determine, from these features, the pixel points belonging to lane lines. For example, it can identify that a given pixel belongs to a particular lane line in the driving image. Further, if the driving image contains a dashed lane line, the image recognition module may also determine, from the extracted pixel features, the pixel points corresponding to the blank positions in the dashed lane line as pixel points belonging to that lane line, where a blank position is a position in the dashed lane line where no line segment is painted.
In practical application, each pixel point in the driving image can have a pixel coordinate. Any vertex of the image can be used as a coordinate origin, and two intersecting image edges of the vertex can be used as coordinate axes, so that an image coordinate system can be established, each pixel point has a group of coordinate values (such as coordinate values (x, y)), and the coordinate values of the pixel points in the image coordinate system can be called as pixel coordinates. And the pixel coordinates of each pixel point on each lane line can be output through the image identification module.
In practical applications, the deep learning network may be a lane line detection network (e.g., a LaneNet network); after training, the LaneNet network can accurately identify the pixel points belonging to the same lane line and the pixel coordinates of each pixel point.
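The exact output format of the recognition module is not specified here; assuming it yields an instance mask in which pixels of the same lane line share an integer id (0 for background), the per-lane pixel coordinates could be collected as in the hedged sketch below (this is not LaneNet's actual API).

```python
import numpy as np

def lane_pixel_coordinates(instance_mask: np.ndarray) -> dict:
    """instance_mask: HxW integer array, 0 = background, k > 0 = lane line k.
    Returns {lane_id: (N, 2) array of (x, y) pixel coordinates}, where the
    image coordinate system has its origin at a corner of the image."""
    lanes = {}
    for lane_id in np.unique(instance_mask):
        if lane_id == 0:
            continue
        ys, xs = np.nonzero(instance_mask == lane_id)
        lanes[int(lane_id)] = np.stack([xs, ys], axis=1)   # (x, y) pixel coordinates
    return lanes
```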
And step S602, performing coordinate conversion on the pixel coordinate according to the initial pitch angle value to obtain the spatial position coordinate of the pixel point in the world coordinate system.
Specifically, the world coordinate system may refer to the coordinate system in which the vehicle is located. The world coordinate system may be a spatial coordinate system whose origin is the vertical projection position of the camera device on the road surface (which can be understood as the position directly below the camera). When the camera device is installed on a vehicle, it sits at a certain height above the road surface (a vertical distance, i.e., the straight-line distance between the camera device and the road surface), so when converting a pixel coordinate in a driving image into a spatial position coordinate in the world coordinate system, an initial pitch angle value (pitch angle parameter) of the camera device can be set, so that the spatial position coordinate in the world coordinate system depends on this initial pitch angle parameter. On this basis, the spatial position coordinate corresponding to each pixel point can be determined from the initial pitch angle value and the height (vertical distance) of the camera. The specific method is: obtain the vertical distance corresponding to the camera device, where the vertical distance refers to the straight-line distance between the camera device and the road surface; determine a rotation matrix corresponding to the camera device according to the initial pitch angle value, and determine the spatial position coordinates of the pixel points in the world coordinate system according to the rotation matrix, the vertical distance, and the pixel coordinates.
For ease of understanding, please refer to formula (1): the spatial position coordinates of the pixel points can be determined by back-projecting each pixel onto the road plane, which can be written in the following form:

$$\begin{bmatrix} x \\ y \\ 0 \end{bmatrix} \;=\; \begin{bmatrix} 0 \\ 0 \\ H \end{bmatrix} \;-\; \frac{H}{\big[\,R\,K^{-1}P_w\,\big]_{z}}\;R\,K^{-1}\,P_w \qquad (1)$$

In formula (1), P_w is used to characterize the (homogeneous) pixel coordinates in the image coordinate system; H is used to characterize the vertical distance between the camera device and the horizontal ground; R is used to characterize the rotation matrix of the camera device, which can be determined from the camera extrinsic parameters (such as the pitch angle and the camera installation height); K is used to characterize the camera intrinsic parameters of the camera device (such as focal length and distortion); [·]_z denotes the third (height) component of a vector; and x and y in formula (1) constitute the spatial position coordinates.
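A sketch of this coordinate conversion is given below, under the stated assumptions of a pinhole camera, a rotation parameterized by the pitch angle only, and a world origin directly below the camera at ground level; the axis conventions and function names are illustrative, since the exact parameterization used here is not given.

```python
import numpy as np

def rotation_from_pitch(pitch: float) -> np.ndarray:
    """Rotation matrix of the camera device for a given pitch angle (radians),
    assuming the camera is only pitched about the lateral (x) axis."""
    c, s = np.cos(pitch), np.sin(pitch)
    return np.array([[1, 0, 0],
                     [0, c, -s],
                     [0, s,  c]])

def pixel_to_ground(pixel_xy, K, pitch, camera_height):
    """Back-project a pixel onto the road plane z = 0 (formula (1)).
    The world origin is the vertical projection of the camera on the ground,
    so the camera sits at (0, 0, camera_height)."""
    u, v = pixel_xy
    ray = rotation_from_pitch(pitch) @ np.linalg.inv(K) @ np.array([u, v, 1.0])
    if ray[2] >= 0:                        # ray does not hit the ground plane
        return None
    scale = -camera_height / ray[2]        # stretch the ray until z = 0
    x, y, _ = scale * ray + np.array([0.0, 0.0, camera_height])
    return np.array([x, y])                # spatial position coordinates
```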
And step S603, performing linear fitting on the spatial position coordinates to obtain a fitting linear line corresponding to the pixel point, and determining a first offset angle corresponding to the vehicle at the first shooting moment according to the fitting linear line.
Specifically, after the spatial position coordinates corresponding to each pixel point on each lane line are determined, straight-line fitting can be performed on the spatial position coordinates to obtain a fitted straight line corresponding to each lane line. From the one or more fitted straight lines corresponding to each driving image (such as the first driving sub-image), the offset angle of the vehicle at the corresponding shooting time (such as the first shooting time) can be determined. Taking the first driving sub-image as an example, if the first driving sub-image contains N lane lines, then N fitted straight lines correspond to the first driving sub-image. The specific method for determining the first offset angle corresponding to the vehicle at the first shooting time from the N fitted straight lines may be: obtain the straight-line slope corresponding to each of the N fitted straight lines; sort the N slopes by magnitude to obtain a slope sequence; take the first slope and the second slope in order from the slope sequence, determine the fitted straight line corresponding to the first slope as the first target fitted straight line, and determine the fitted straight line corresponding to the second slope as the second target fitted straight line; the first offset angle corresponding to the vehicle at the first shooting time can then be determined from the first target fitted straight line and the second target fitted straight line.
Specifically, a specific implementation manner for determining the first offset angle corresponding to the vehicle at the first shooting time according to the first target fitting straight line and the second target fitting straight line may be as follows: the vertical projection ground position of the camera equipment on the horizontal ground of the road can be determined, and the vertical projection ground position can be determined as the world coordinate origin of a world coordinate system; subsequently, an intersection point between the first target fitted straight line and the second target fitted straight line and a distance between the intersection point and the world coordinate origin can be determined; determining a straight line relation between the first target fitting straight line and the second target fitting straight line according to the distance; and determining a first offset angle corresponding to the vehicle at the first shooting moment according to the straight line relation.
Specifically, a specific implementation manner for determining the first offset angle corresponding to the vehicle at the first shooting time according to the straight-line relationship may be as follows: if the straight line relation is parallel relation, the initial pitch angle value can be determined to be the correct pitch angle value, and the running direction of the vehicle at the first shooting moment and the included angle between the first target fitting straight line can be determined to be a first offset angle; and if the linear relation is a non-parallel relation, adjusting the initial pitch angle value, performing coordinate conversion on the pixel coordinate according to the adjusted pitch angle value to obtain an updated spatial position coordinate of the pixel point in a world coordinate system, performing linear fitting on the updated spatial position coordinate to obtain an updated fitting linear line corresponding to the pixel point, and determining a first offset angle corresponding to the vehicle at the first shooting moment according to the updated fitting linear line.
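The selection of the two steepest fitted straight lines, the intersection-distance test, and the resulting offset angle can be sketched as follows. Lines are represented here as (slope, intercept) pairs in the ground plane, and the parallel test via a distance threshold is an illustrative simplification; the function names and the threshold are assumptions.

```python
import numpy as np

def fit_line(points: np.ndarray):
    """Least-squares fit y = k*x + b through ground-plane points of shape (N, 2)."""
    k, b = np.polyfit(points[:, 0], points[:, 1], 1)
    return k, b

def first_offset_angle(lines, driving_direction, origin=np.zeros(2),
                       parallel_distance=1e6):
    """lines: list of (slope, intercept). Pick the two largest slopes, check
    whether those two fitted lines are (nearly) parallel via the distance from
    their intersection to the world origin, and if so return the offset angle
    (radians) between the driving direction and the first target line.
    Returns None when the lines intersect near the origin, i.e. the pitch
    angle still needs adjusting."""
    if len(lines) < 2:
        raise ValueError("need at least two fitted lane lines")
    ordered = sorted(lines, key=lambda kb: kb[0], reverse=True)
    (k1, b1), (k2, b2) = ordered[0], ordered[1]          # first / second target lines
    if not np.isclose(k1, k2):
        x = (b2 - b1) / (k1 - k2)                        # intersection point
        y = k1 * x + b1
        if np.hypot(x - origin[0], y - origin[1]) < parallel_distance:
            return None                                  # non-parallel: adjust pitch
    line_dir = np.array([1.0, k1]) / np.hypot(1.0, k1)   # direction of target line
    v = np.asarray(driving_direction, dtype=float)
    v = v / np.linalg.norm(v)
    cos_angle = np.clip(abs(line_dir @ v), -1.0, 1.0)
    return float(np.arccos(cos_angle))                   # first offset angle
```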
For ease of understanding, please refer to fig. 8a together; fig. 8a is a schematic view of a scenario for determining an offset angle according to an embodiment of the present application. As shown in fig. 8a, the driving image 700a may be the first driving sub-image, and the driving image 700a contains a plurality of lane lines. The driving image 700a can be input to the image recognition module (such as a trained LaneNet network), which can recognize the pixel points of each lane line and their pixel coordinates; the pixel points may then be projected onto the horizontal ground (i.e., the road surface) according to the camera intrinsic parameters, the camera extrinsic parameters, and the installation angle (i.e., the pixel coordinates are converted into spatial position coordinates). For ease of display, a top view can be computed after the projection, shown as the top-view image 700b, which contains the pixel points corresponding to the plurality of lane lines; the arrow direction shown in the image 700b indicates the actual forward direction along the lane lines.
Furthermore, by fitting the spatial position coordinates of the pixel points belonging to the same lane line, a plurality of fitted straight lines can be obtained. As shown in image 700c, the fitted straight lines may include fitted straight line 70a, fitted straight line 70b, fitted straight line 70c, fitted straight line 70d, and fitted straight line 70e.
Further, please refer to fig. 8b together; fig. 8b is a schematic view of a scenario for determining an offset angle according to an embodiment of the present application. As shown in fig. 8b, in the image 700c, the slope of each fitted straight line may be determined (for example, each fitted straight line may be expressed as a linear equation based on the spatial position coordinates of its pixel points, and its slope determined from that equation). Among these slopes, the maximum slope and the second-largest slope (smaller only than the maximum slope but larger than all other slopes) may be selected, and the fitted straight lines corresponding to the maximum slope and the second-largest slope may be used as the first target fitted straight line and the second target fitted straight line.
For example, as shown in fig. 8b, if the slope of the fitted straight line 70d is the largest and the slope of the fitted straight line 70c is the second largest, the fitted straight line 70d and the fitted straight line 70c may be selected as the first target fitted straight line and the second target fitted straight line, respectively. Further, the intersection point of the fitted straight line 70c and the fitted straight line 70d may be determined, and the distance between this intersection point and the world coordinate origin of the world coordinate system may be determined and recorded as distance 1. If no intersection point exists, the intersection point may be recorded as 0 or null and distance 1 is likewise null (or treated as invalid), which shows that the fitted straight line 70c is parallel to the fitted straight line 70d. In the actual three-dimensional space, the fitted straight line 70c and the fitted straight line 70d should be parallel, so this shows that the pitch angle value of the camera device set above (i.e., the initial pitch angle value) is a correct value, the spatial position coordinates obtained with it are also correct, and correct fitted straight lines are therefore obtained.
If distance 1 is a positive value, the intersection point exists and the fitted straight line 70c and the fitted straight line 70d intersect. However, in the actual three-dimensional space the fitted straight line 70c and the fitted straight line 70d should be parallel, which shows that the pitch angle value of the camera device (i.e., the initial pitch angle value) is an erroneous value and the spatial position coordinates obtained with it are also erroneous. The initial pitch angle value can therefore be adjusted to obtain an adjusted pitch angle value; the pixel coordinates of the pixel points can then be converted again using the adjusted pitch angle value to obtain new spatial position coordinates, from which a new fitted straight line 70c and a new fitted straight line 70d can be obtained. It should be understood that the purpose of adjusting the pitch angle value is to bring the fitted straight line 70c and the fitted straight line 70d ever closer to parallel. The intersection point between the new fitted straight line 70c and the new fitted straight line 70d can then be determined again, and the distance between this new intersection point and the world coordinate origin can be computed and recorded as distance 2. If no intersection point exists, the new intersection point may be recorded as 0 or null (or positive infinity), distance 2 is likewise null (or treated as invalid), and the new fitted straight line 70c is parallel to the new fitted straight line 70d; this shows that the adjusted pitch angle value is the correct value, the spatial position coordinates obtained with it are correct, and correct fitted straight lines have been obtained. If distance 2 is a positive value, the new fitted straight line 70c and the new fitted straight line 70d still intersect; the pitch angle value can then be adjusted again, the coordinate conversion performed again to obtain new spatial position coordinates, and the straight-line fitting performed again to obtain new fitted straight lines 70c and 70d, until the distance shows that the final fitted straight line 70c and fitted straight line 70d are parallel. At that point, the correct pitch angle value of the camera device has been obtained.
After the fitted straight line 70c is parallel to the fitted straight line 70d, as shown in fig. 8b, either of the fitted straight lines 70c and 70d may be selected to determine the offset angle (which may be understood as the yaw angle of the vehicle relative to the lane line). As shown in fig. 8b, the fitted straight line 70c may be selected to determine the offset angle of the vehicle: the driving direction of the vehicle at that time (e.g., the direction of the arrow in image 700c) may be obtained, and the value of the included angle M between the driving direction and the fitted straight line 70c may be determined; the included angle M is the yaw angle (i.e., the offset angle) of the vehicle when the driving image 700a was captured.
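Putting the previous sketches together, the iterative pitch refinement described above could be expressed as the loop below. Here `project_and_fit` stands for a hypothetical routine that back-projects the lane pixels with a candidate pitch value (e.g., via `pixel_to_ground`) and refits the lines (e.g., via `fit_line`), `first_offset_angle` is the function from the earlier sketch, and the adjustment step size and iteration cap are illustrative assumptions rather than values from the original method.

```python
def calibrate_pitch(lane_pixels, K, camera_height, initial_pitch,
                    project_and_fit, driving_direction,
                    step=0.002, max_iters=200):
    """Adjust the pitch angle until the two target fitted lines become
    parallel (first_offset_angle returns a value instead of None), then
    return (correct_pitch, first_offset_angle)."""
    pitch = initial_pitch
    for _ in range(max_iters):
        lines = project_and_fit(lane_pixels, K, pitch, camera_height)
        angle = first_offset_angle(lines, driving_direction)
        if angle is not None:        # lines are parallel: pitch is correct
            return pitch, angle
        pitch += step                # adjust the pitch value and try again
    raise RuntimeError("pitch did not converge within max_iters")
```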
It should be understood that, in the embodiment of the present application, the driving images captured by the camera device are obtained, and the yaw angle and the angular velocity of the vehicle can then be calculated from the lane lines in the driving images, so that time calibration between the camera device and the inertial sensor can be performed according to the correlation between the angular velocity derived from the camera device and the angular velocity of the inertial sensor. The method requires neither additional hardware equipment nor particular imaging quality from the camera device; accurate time calibration can be performed as long as lane lines exist, which makes it more flexible.
Further, please refer to fig. 9, where fig. 9 is a schematic structural diagram of a data processing apparatus according to an embodiment of the present application. The data processing means may be a computer program (comprising program code) running on a computer device, for example the data processing means being an application software; the data processing apparatus may be adapted to perform the method illustrated in fig. 3. As shown in fig. 9, the data processing apparatus 1 may include: an image acquisition module 11, a curve determination module 12 and a time calibration module 13.
An image acquisition module 11 for acquiring a travel image associated with a vehicle; the driving image is obtained by shooting by a camera device configured on the vehicle; the driving image contains a lane line in a road surface on which the vehicle is driving;
a curve determining module 12, configured to determine a first angular velocity variation curve of the vehicle in a first time interval according to a lane line included in the driving image;
a curve determining module 12, configured to determine a second angular velocity variation curve of the vehicle in a second time interval according to the set of angular velocities of the vehicle detected by the sensor in the second time interval;
and the time calibration module 13 is configured to perform time calibration on the camera device and the sensor according to a curve association degree between a first angular velocity change curve of the vehicle in a first time interval and a second angular velocity change curve of the vehicle in a second time interval.
For specific implementation manners of the image obtaining module 11, the curve determining module 12 and the time calibrating module 13, reference may be made to the descriptions of step S101 to step S104 in the embodiment corresponding to fig. 2, which will not be described herein again.
In one embodiment, the time calibration module 13 may include: a correlation matching unit 131, a time updating unit 132, and a time scaling unit 133.
A correlation matching unit 131, configured to determine a curve correlation between a first angular velocity variation curve in a first time interval and a second angular velocity variation curve in a second time interval, and match the curve correlation with a correlation threshold;
a time updating unit 132, configured to update a second time interval used for mapping the second angular velocity change curve with time if the curve correlation degree is smaller than the correlation degree threshold, so as to obtain an update time interval used for remapping the second angular velocity change curve; the duration of the update time interval is equal to the duration of the second time interval;
the time calibration unit 133 is configured to perform time calibration on the camera device and the sensor according to a second angular velocity change curve of the vehicle in the update time interval and a first angular velocity change curve of the vehicle in the first time interval;
the time calibration unit 133 is further configured to, if the curve relevance is greater than the relevance threshold, take the time matching result as a time calibration result between the camera device and the sensor.
For a specific implementation manner of the association matching unit 131, the time updating unit 132, and the time calibrating unit 133, reference may be made to the description of step S104 in the embodiment corresponding to fig. 2, which will not be described herein again.
In one embodiment, the second angular velocity change curve is in a curve coordinate system, and the curve coordinate system includes a time axis;
the time updating unit 132 may include: a curve translation sub-unit 1321, a timestamp acquisition sub-unit 1322, and a section determination sub-unit 1323.
The curve translation subunit 1321 is configured to acquire a translation amount, and translate a second angular velocity change curve in a second time interval in the curve coordinate system according to the translation amount and the target translation direction; the target translation direction and the axis direction of the time axis belong to a parallel relation;
a timestamp obtaining subunit 1322, configured to obtain a curve start position and a curve end position of the translated second angular velocity change curve in the curve coordinate system, and obtain a start timestamp corresponding to the curve start position and an end timestamp corresponding to the curve end position on the time axis;
an interval determination subunit 1323, configured to determine a time interval formed by the start timestamp and the end timestamp as the update time interval.
For a specific implementation manner of the curve translating subunit 1321, the timestamp obtaining subunit 1322, and the interval determining subunit 1323, reference may be made to the description of step S104 in the embodiment corresponding to fig. 2, which will not be described herein again.
In an embodiment, the time calibration unit 133 is further specifically configured to determine an update curve association degree between a second angular velocity change curve in the update time interval and a first angular velocity change curve in the first time interval, and match the update curve association degree with the association degree threshold;
the time calibration unit 133 is further specifically configured to determine a time difference between the update time interval and the second time interval if the correlation degree of the update curve is greater than the correlation degree threshold, and determine the time difference as a time calibration result between the camera device and the sensor.
In one embodiment, the first driving image pair and the second driving image pair are included in the driving image; the first driving image pair comprises a first driving sub-image and a second driving sub-image; the second driving image pair comprises a third driving sub image and a fourth driving sub image;
the curve determination module 12 may include: an angular velocity determination unit 121 and a curve determination unit 122.
An angular velocity determination unit 121 configured to determine a first angular velocity corresponding to the vehicle at a first time, based on lane lines included in the first driving sub-image and the second driving sub-image, respectively; the first time is determined based on a first shooting time and a second shooting time, the first shooting time being the time when the camera device captures the first driving sub-image, and the second shooting time being the time when the camera device captures the second driving sub-image;
the angular velocity determining unit 121 is further configured to determine a second angular velocity corresponding to the vehicle at the second time according to lane lines included in the third driving sub image and the fourth driving sub image, respectively; the second moment is determined according to a third shooting moment and a fourth shooting moment, the third shooting moment is the moment when the camera equipment shoots a third driving sub image, and the fourth shooting moment is the moment when the camera equipment shoots a fourth driving sub image;
the curve determining unit 122 is configured to determine a first angular velocity variation curve corresponding to the vehicle according to the first angular velocity, the second angular velocity, the first time, and the second time.
For specific implementation of the angular velocity determining unit 121 and the curve determining unit 122, reference may be made to the description of step S102 in the embodiment corresponding to fig. 2, and details will not be repeated here.
In one embodiment, the angular velocity determination unit 121 may include: the offset degree determination subunit 1211 and the angular velocity determination subunit 1212.
An offset degree determination subunit 1211 configured to determine, from the lane lines included in the first travel sub image, a first offset angle corresponding to the vehicle at the first shooting time; the first offset angle is the offset angle between the driving direction of the vehicle at the first shooting moment and the lane line;
an offset degree determining subunit 1211, configured to determine, according to the lane line included in the second driving sub-image, a second offset angle corresponding to the vehicle at the second shooting time; the second offset angle is the offset angle between the driving direction of the vehicle at the second shooting moment and the lane line;
and the angular velocity determining subunit 1212 is configured to determine a first angular velocity corresponding to the vehicle at the first time according to the first offset angle, the second offset angle, the first shooting time, and the second shooting time.
For a specific implementation manner of the offset degree determining subunit 1211 and the angular velocity determining subunit 1212, reference may be made to the description of step S102 in the embodiment corresponding to fig. 2, which will not be described herein again.
In an embodiment, the offset determining subunit 1211 is further specifically configured to identify a lane line included in the first driving sub-image, and determine pixel coordinates of a pixel point corresponding to the lane line in the image coordinate system; the image coordinate system is a coordinate system corresponding to the first driving subimage;
the offset determining subunit 1211 is further specifically configured to perform coordinate conversion on the pixel coordinate according to the initial pitch angle value to obtain a spatial position coordinate of the pixel point in the world coordinate system;
the offset determining subunit 1211 is further specifically configured to perform straight line fitting on the spatial position coordinates to obtain a fitted straight line corresponding to the pixel point, and determine a first offset angle corresponding to the vehicle at the first shooting time according to the fitted straight line.
In an embodiment, the offset determining subunit 1211 is further specifically configured to obtain a vertical distance corresponding to the camera device; the vertical distance is a straight-line distance between the camera equipment and the road ground;
the offset determining subunit 1211 is further specifically configured to determine a rotation matrix corresponding to the camera device according to the initial pitch angle value, and determine spatial position coordinates of the pixel point in the world coordinate system according to the rotation matrix, the vertical distance, and the pixel coordinate.
In one embodiment, the lane lines included in the first travel sub-image include N lane lines, and the first travel sub-image corresponds to N fitted straight lines;
the offset determining subunit 1211 is further specifically configured to obtain a slope of each of the N fitting lines;
the offset determining subunit 1211 is further specifically configured to sort the N linear slopes according to a size order to obtain a linear slope sequence;
the offset determining subunit 1211 is further specifically configured to obtain a first straight-line slope and a second straight-line slope in sequence in the straight-line slope sequence, determine a fitting straight line corresponding to the first straight-line slope as a first target fitting straight line, and determine a fitting straight line corresponding to the second straight-line slope as a second target fitting straight line;
the offset determining subunit 1211 is further specifically configured to determine a first offset angle corresponding to the vehicle at the first shooting time according to the first target fitted straight line and the second target fitted straight line.
In one embodiment, the offset determining subunit 1211 is further specifically configured to determine a vertical projection ground position of the camera device on the horizontal ground of the road, and determine the vertical projection ground position as a world coordinate origin of the world coordinate system;
an offset determining subunit 1211, further specifically configured to determine an intersection point between the first target fitted straight line and the second target fitted straight line, and a distance between the intersection point and the world coordinate origin;
the offset degree determining subunit 1211 is further specifically configured to determine a straight line relationship between the first target fitted straight line and the second target fitted straight line according to the distance;
the offset degree determining subunit 1211 is further specifically configured to determine a first offset angle corresponding to the vehicle at the first shooting time according to the straight line relationship.
In an embodiment, the offset determining subunit 1211 is further specifically configured to determine, if the straight line relationship is a parallel relationship, that the initial pitch angle value is a correct pitch angle value, and determine, as the first offset angle, an included angle between the driving direction of the vehicle at the first shooting time and the first target fitting straight line;
the offset determining subunit 1211 is further specifically configured to, if the linear relationship is a non-parallel relationship, adjust the initial pitch angle value, perform coordinate conversion on the pixel coordinate according to the adjusted pitch angle value to obtain an updated spatial position coordinate of the pixel point in the world coordinate system, perform linear fitting on the updated spatial position coordinate to obtain an updated fitted linear line corresponding to the pixel point, and determine a first offset angle corresponding to the vehicle at the first shooting time according to the updated fitted linear line.
In one embodiment, the first shooting moment and the second shooting moment are adjacent shooting moments;
the angular velocity determining subunit 1212 is further specifically configured to determine an absolute value of an angle difference between the first offset angle and the second offset angle;
the angular velocity determining subunit 1212 is further specifically configured to acquire an intermediate shooting time between the first shooting time and the second shooting time, and determine the intermediate shooting time as the first time;
the angular velocity determining subunit 1212 is further specifically configured to determine a shooting interval duration between the first shooting time and the second shooting time, and determine a first angular velocity of the vehicle at the first time according to the absolute value of the angle difference and the shooting interval duration.
In one embodiment, the curve determining unit 122 may include: an initial curve determining subunit 1221, a mean data determining subunit 1222, and a curve determining subunit 1223.
An initial curve determining subunit 1221, configured to determine an initial angular velocity variation curve corresponding to the vehicle according to the first time, the first angular velocity, the second time, and the second angular velocity;
a mean data determining subunit 1222, configured to determine a mean time between the first time and the second time, and determine a mean angular velocity between the first angular velocity and the second angular velocity as the angular velocity corresponding to the vehicle at the mean time;
and the curve determining subunit 1223 is configured to determine a first angular velocity change curve corresponding to the vehicle according to the first time, the first angular velocity, the second time, the second angular velocity, the mean time, and the mean angular velocity.
For a specific implementation manner of the initial curve determining subunit 1221, the mean data determining subunit 1222, and the curve determining subunit 1223, reference may be made to the description in step S102 in the embodiment corresponding to fig. 2, which will not be described herein again.
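As an illustrative sketch of the processing in the curve determining unit 122 (the interpolation scheme and the function name are assumptions, not the claimed implementation), the mean time and mean angular velocity can be inserted between adjacent samples, and the densified samples can then be interpolated or smoothed into the first angular velocity change curve:

```python
import numpy as np

def build_first_curve(times, angular_velocities):
    """Insert the mean time / mean angular velocity between adjacent samples and
    return the densified sample set; a spline or other smoother could then turn
    these samples into the first angular velocity change curve."""
    t = np.asarray(times, dtype=float)
    w = np.asarray(angular_velocities, dtype=float)
    t_mid = (t[:-1] + t[1:]) / 2.0   # mean time between adjacent moments
    w_mid = (w[:-1] + w[1:]) / 2.0   # mean angular velocity
    t_all = np.concatenate([t, t_mid])
    w_all = np.concatenate([w, w_mid])
    order = np.argsort(t_all)        # keep the samples in time order
    return t_all[order], w_all[order]
```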
In the embodiments of the present application, a driving image of a vehicle driving on the road can be shot by the camera device, where the driving image includes lane lines. A first angular velocity change curve of the vehicle in a first time interval can then be determined from the lane lines in the driving image; a sensor in the vehicle can also detect an angular velocity set of the vehicle, and a second angular velocity change curve of the vehicle in a second time interval can be determined from that angular velocity set. By calculating the curve correlation degree between the first angular velocity change curve and the second angular velocity change curve, time calibration can then be performed between the camera device and the sensor. Therefore, when time calibration is performed between the camera device and the sensor, no additional equipment and no special external environment are required; only a lane line is needed, which provides high flexibility and greatly reduces the calibration cost. Meanwhile, because time calibration is performed by determining the curve correlation degree between the angular velocity change curve corresponding to the camera device and the angular velocity change curve corresponding to the sensor, the calibration is carried out only when the angular velocity derived from the images shot by the camera device and the angular velocity detected by the sensor are highly correlated, so that the obtained time calibration result has high accuracy. In conclusion, in a scene of time calibration between the camera device and the sensor, the present application can improve calibration flexibility and accuracy and reduce the calibration cost.
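For illustration, the time calibration by curve correlation described above can be sketched as a search over time translations of the sensor curve; the shift range, step, resampling grid, and threshold below are assumptions introduced for explanation and do not represent the claimed implementation.

```python
import numpy as np

def time_calibrate(t_cam, w_cam, t_imu, w_imu,
                   corr_threshold=0.9, max_shift=0.5, step=0.01):
    """Translate the sensor (second) angular velocity curve along the time axis,
    resample both curves on a common grid, and return the translation whose curve
    correlation degree is highest, provided it exceeds the threshold; otherwise
    return None."""
    best_shift, best_corr = 0.0, -1.0
    for shift in np.arange(-max_shift, max_shift + step, step):
        grid = np.arange(max(t_cam[0], t_imu[0] + shift),
                         min(t_cam[-1], t_imu[-1] + shift), step)
        if len(grid) < 2:
            continue                                   # no overlap at this shift
        a = np.interp(grid, t_cam, w_cam)              # camera-derived curve
        b = np.interp(grid, t_imu + shift, w_imu)      # translated sensor curve
        corr = np.corrcoef(a, b)[0, 1]                 # curve correlation degree
        if not np.isnan(corr) and corr > best_corr:
            best_shift, best_corr = shift, corr
    return best_shift if best_corr >= corr_threshold else None  # time calibration result
```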
Further, please refer to fig. 10, which is a schematic structural diagram of a computer device according to an embodiment of the present application. As shown in fig. 10, the apparatus 1 in the embodiment corresponding to fig. 9 may be applied to the computer device 8000. The computer device 8000 may include a processor 8001, a network interface 8004, and a memory 8005, and further includes a user interface 8003 and at least one communication bus 8002. The communication bus 8002 is used to implement connection and communication between these components. The user interface 8003 may include a display (Display) and a keyboard (Keyboard); optionally, the user interface 8003 may further include a standard wired interface and a wireless interface. The network interface 8004 may optionally include a standard wired interface and a wireless interface (e.g., a WI-FI interface). The memory 8005 may be a high-speed RAM memory, or may be a non-volatile memory, such as at least one disk memory. Optionally, the memory 8005 may also be at least one storage device located remotely from the aforementioned processor 8001. As shown in fig. 10, the memory 8005, as a computer-readable storage medium, may include an operating system, a network communication module, a user interface module, and a device control application program.
In the computer device 8000 shown in fig. 10, the network interface 8004 may provide a network communication function, the user interface 8003 mainly provides an input interface for the user, and the processor 8001 may be configured to invoke the device control application stored in the memory 8005 to implement:
acquiring a driving image associated with a vehicle; the driving image is shot by a camera device configured on the vehicle; the driving image includes lane lines in a road surface on which the vehicle is driving;
determining a first angular velocity change curve of the vehicle in a first time interval according to a lane line included in the driving image;
determining a second angular velocity change curve of the vehicle in a second time interval according to a set of angular velocities of the vehicle detected by a sensor in the second time interval;
and performing time calibration on the camera device and the sensor according to a curve correlation degree between the first angular velocity change curve of the vehicle in the first time interval and the second angular velocity change curve of the vehicle in the second time interval.
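Purely as a usage illustration, the four steps invoked by the processor 8001 can be chained together using the sketches given earlier in this description (first_offset_angle, first_angular_velocity, build_first_curve, and time_calibrate); the data layout and all names are hypothetical.

```python
def calibrate_camera_and_sensor(image_lane_pixel_pairs, shooting_time_pairs,
                                imu_times, imu_rates, K, cam_height,
                                pitch_candidates):
    """image_lane_pixel_pairs[i] holds the lane pixel sets of the two driving
    sub-images of pair i; shooting_time_pairs[i] holds their shooting times."""
    times, rates = [], []
    for (lanes_a, lanes_b), (t_a, t_b) in zip(image_lane_pixel_pairs,
                                              shooting_time_pairs):
        ang_a = first_offset_angle(lanes_a[0], lanes_a[1], K, cam_height, pitch_candidates)
        ang_b = first_offset_angle(lanes_b[0], lanes_b[1], K, cam_height, pitch_candidates)
        t_mid, w = first_angular_velocity(ang_a, ang_b, t_a, t_b)
        times.append(t_mid)
        rates.append(w)
    t_cam, w_cam = build_first_curve(times, rates)             # first angular velocity change curve
    return time_calibrate(t_cam, w_cam, imu_times, imu_rates)  # time calibration result
```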
It should be understood that the computer device 8000 described in this embodiment may perform the data processing method described in the embodiments corresponding to fig. 2 to fig. 7, and may also perform the functions of the data processing apparatus 1 described in the embodiment corresponding to fig. 9, which are not repeated here. In addition, the description of the beneficial effects of the same method is not repeated.
Further, it should be noted that an embodiment of the present application also provides a computer-readable storage medium in which the computer program executed by the aforementioned data processing computer device 1000 is stored, the computer program including program instructions. When executing the program instructions, the processor can perform the data processing method described in the embodiments corresponding to fig. 2 to fig. 7, which is therefore not repeated here. In addition, the description of the beneficial effects of the same method is not repeated. For technical details not disclosed in the embodiment of the computer-readable storage medium of the present application, reference is made to the description of the method embodiments of the present application.
The computer-readable storage medium may be the data processing apparatus provided in any of the foregoing embodiments, or an internal storage unit of the computer device, such as a hard disk or a memory of the computer device. The computer-readable storage medium may also be an external storage device of the computer device, such as a plug-in hard disk, a smart media card (SMC), a secure digital (SD) card, or a flash card provided on the computer device. Further, the computer-readable storage medium may include both an internal storage unit and an external storage device of the computer device. The computer-readable storage medium is used to store the computer program and other programs and data required by the computer device, and may also be used to temporarily store data that has been output or is to be output.
In one aspect of the application, a computer program product or computer program is provided, the computer program product or computer program comprising computer instructions stored in a computer readable storage medium. The processor of the computer device reads the computer instructions from the computer-readable storage medium, and the processor executes the computer instructions to cause the computer device to perform the method provided by one aspect of the embodiments of the present application.
The terms "first," "second," and the like in the description and in the claims and drawings of the embodiments of the present application are used for distinguishing between different objects and not for describing a particular order. Furthermore, the terms "comprises" and any variations thereof, are intended to cover non-exclusive inclusions. For example, a process, method, apparatus, product, or apparatus that comprises a list of steps or elements is not limited to the listed steps or modules, but may alternatively include other steps or modules not listed or inherent to such process, method, apparatus, product, or apparatus.
Those of ordinary skill in the art will appreciate that the elements and algorithm steps of the examples described in connection with the embodiments disclosed herein may be implemented in electronic hardware, computer software, or a combination of the two; to clearly illustrate the interchangeability of hardware and software, the components and steps of the examples have been described above generally in terms of their functions. Whether such functions are implemented in hardware or software depends on the particular application and the design constraints imposed on the implementation. Skilled artisans may implement the described functions in different ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
The method and the related apparatus provided by the embodiments of the present application are described with reference to the flowchart and/or the structural diagram of the method provided by the embodiments of the present application, and each flow and/or block of the flowchart and/or the structural diagram of the method, and the combination of the flow and/or block in the flowchart and/or the block diagram can be specifically implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block or blocks of the block diagram. These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block or blocks of the block diagram. These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block or blocks.
The above disclosure is only a preferred embodiment of the present application and is not intended to limit the scope of the present application; the present application is therefore not limited thereto, and equivalent variations and modifications made in accordance with the present application still fall within its scope.

Claims (16)

1. A data processing method, comprising:
acquiring a driving image associated with a vehicle; the driving image is shot by a camera device configured on the vehicle; the driving image includes lane lines in a road surface on which the vehicle is driving;
determining a first angular speed change curve of the vehicle in a first time interval according to a lane line included in the driving image;
determining a second angular speed change curve of the vehicle in a second time interval according to a set of angular velocities of the vehicle detected by a sensor in the second time interval;
and performing time calibration on the camera device and the sensor according to a curve correlation degree between the first angular speed change curve of the vehicle in the first time interval and the second angular speed change curve of the vehicle in the second time interval.
2. The method of claim 1, wherein the performing time calibration on the camera device and the sensor according to the curve correlation degree between the first angular speed change curve of the vehicle in the first time interval and the second angular speed change curve of the vehicle in the second time interval comprises:
determining a curve correlation degree between the first angular speed change curve in the first time interval and the second angular speed change curve in the second time interval, and matching the curve correlation degree with a correlation degree threshold value;
if the curve correlation degree is smaller than the correlation degree threshold value, performing time updating on the second time interval to which the second angular speed change curve is mapped, to obtain an updated time interval to which the second angular speed change curve is remapped, and performing time calibration on the camera device and the sensor according to the second angular speed change curve of the vehicle in the updated time interval and the first angular speed change curve of the vehicle in the first time interval; the duration of the updated time interval is equal to the duration of the second time interval;
and if the curve correlation degree is greater than the correlation degree threshold value, taking a time matching result as a time calibration result between the camera device and the sensor.
3. The method of claim 2, wherein the second angular velocity change curve is located in a curve coordinate system, and the curve coordinate system comprises a time axis;
the performing time updating on the second time interval to which the second angular velocity change curve is mapped, to obtain an updated time interval to which the second angular velocity change curve is remapped, comprises:
acquiring a translation amount, and translating the second angular velocity change curve in the second time interval in the curve coordinate system according to the translation amount and a target translation direction; the target translation direction and the axis direction of the time axis belong to a parallel relation;
acquiring a curve starting position and a curve ending position of the second angular velocity change curve after translation in the curve coordinate system, and acquiring a starting time stamp corresponding to the curve starting position and an ending time stamp corresponding to the curve ending position on the time axis;
and determining a time interval formed by the starting time stamp and the ending time stamp as the updated time interval.
4. The method of claim 2, wherein the performing time calibration on the camera device and the sensor according to the second angular speed change curve of the vehicle in the updated time interval and the first angular speed change curve of the vehicle in the first time interval comprises:
determining an updated curve correlation degree between the second angular speed change curve in the updated time interval and the first angular speed change curve in the first time interval, and matching the updated curve correlation degree with the correlation degree threshold value;
and if the updated curve correlation degree is greater than the correlation degree threshold value, determining a time difference value between the updated time interval and the second time interval, and determining the time difference value as a time calibration result between the camera device and the sensor.
5. The method of claim 1, wherein the driving image comprises a first driving image pair and a second driving image pair; the first driving image pair comprises a first driving sub-image and a second driving sub-image; and the second driving image pair comprises a third driving sub-image and a fourth driving sub-image;
the determining a first angular speed change curve of the vehicle in a first time interval according to the lane line included in the driving image comprises:
determining a first angular speed corresponding to the vehicle at a first moment according to lane lines respectively included in the first driving sub-image and the second driving sub-image; the first moment is determined based on a first shooting moment and a second shooting moment, the first shooting moment is the moment when the camera device shoots the first driving sub-image, and the second shooting moment is the moment when the camera device shoots the second driving sub-image;
determining a second angular speed corresponding to the vehicle at a second moment according to lane lines respectively included in the third driving sub-image and the fourth driving sub-image; the second moment is determined according to a third shooting moment and a fourth shooting moment, the third shooting moment is the moment when the camera device shoots the third driving sub-image, and the fourth shooting moment is the moment when the camera device shoots the fourth driving sub-image;
and determining a first angular speed change curve corresponding to the vehicle according to the first angular speed, the second angular speed, the first moment and the second moment.
6. The method according to claim 5, wherein determining the first angular velocity of the vehicle at the first time according to the lane lines included in the first driving sub-image and the second driving sub-image respectively comprises:
determining a first offset angle corresponding to the vehicle at the first shooting moment according to a lane line included in the first driving sub-image; the first offset angle is an offset angle between the driving direction of the vehicle at the first shooting moment and the lane line;
determining a second offset angle corresponding to the vehicle at the second shooting moment according to the lane line included in the second driving sub-image; the second offset angle is an offset angle between the driving direction of the vehicle at the second shooting time and the lane line;
and determining a first angular speed corresponding to the vehicle at a first moment according to the first offset angle, the second offset angle, the first shooting moment and the second shooting moment.
7. The method according to claim 6, wherein the determining a first offset angle corresponding to the vehicle at the first shooting moment according to a lane line included in the first driving sub-image comprises:
identifying a lane line included in the first driving sub-image, and determining pixel coordinates of pixel points corresponding to the lane line in an image coordinate system; the image coordinate system is a coordinate system corresponding to the first driving sub-image;
performing coordinate conversion on the pixel coordinate according to the initial pitch angle value to obtain a spatial position coordinate of the pixel point in a world coordinate system;
and performing straight line fitting on the spatial position coordinates to obtain a fitted straight line corresponding to the pixel point, and determining a first offset angle corresponding to the vehicle at the first shooting moment according to the fitted straight line.
8. The method of claim 7, wherein the coordinate transformation of the pixel coordinate according to the initial pitch angle value to obtain the spatial position coordinate of the pixel point in the world coordinate system comprises:
acquiring a vertical distance corresponding to the camera device; the vertical distance is a straight-line distance between the camera device and the road ground;
and determining a rotation matrix corresponding to the camera device according to the initial pitch angle value, and determining the spatial position coordinate of the pixel point in the world coordinate system according to the rotation matrix, the vertical distance and the pixel coordinate.
9. The method according to claim 7, wherein the lane lines included in the first driving sub-image include N lane lines, and the first driving sub-image corresponds to N fitted straight lines;
the determining a first offset angle corresponding to the vehicle at the first shooting moment according to the fitted straight line comprises:
acquiring a straight line slope of each of the N fitted straight lines;
sorting the N straight line slopes in order of magnitude to obtain a straight line slope sequence;
acquiring, in sequence, a first straight line slope and a second straight line slope from the straight line slope sequence, determining the fitted straight line corresponding to the first straight line slope as a first target fitted straight line, and determining the fitted straight line corresponding to the second straight line slope as a second target fitted straight line;
and determining a first offset angle corresponding to the vehicle at the first shooting moment according to the first target fitted straight line and the second target fitted straight line.
10. The method according to claim 9, wherein the determining a first offset angle corresponding to the vehicle at the first shooting moment according to the first target fitted straight line and the second target fitted straight line comprises:
determining a vertical projection ground position of the camera device on the road ground, and determining the vertical projection ground position as a world coordinate origin of the world coordinate system;
determining an intersection point between the first target fitted straight line and the second target fitted straight line, and a distance between the intersection point and the world coordinate origin;
determining a straight line relationship between the first target fitted straight line and the second target fitted straight line according to the distance;
and determining, according to the straight line relationship, a first offset angle corresponding to the vehicle at the first shooting moment.
11. The method of claim 10, wherein the determining, according to the straight line relationship, a first offset angle corresponding to the vehicle at the first shooting moment comprises:
if the straight line relationship is a parallel relationship, determining that the initial pitch angle value is a correct pitch angle value, and determining an included angle between the driving direction of the vehicle at the first shooting moment and the first target fitted straight line as the first offset angle;
and if the straight line relationship is a non-parallel relationship, adjusting the initial pitch angle value, performing coordinate conversion on the pixel coordinate according to the adjusted pitch angle value to obtain an updated spatial position coordinate of the pixel point in the world coordinate system, performing straight line fitting on the updated spatial position coordinate to obtain an updated fitted straight line corresponding to the pixel point, and determining, according to the updated fitted straight line, a first offset angle corresponding to the vehicle at the first shooting moment.
12. The method of claim 6, wherein the first shooting moment and the second shooting moment are adjacent shooting moments;
the determining a first angular velocity corresponding to the vehicle at a first moment according to the first offset angle, the second offset angle, the first shooting moment and the second shooting moment includes:
determining an absolute value of an angle difference between the first offset angle and the second offset angle;
acquiring an intermediate shooting moment between the first shooting moment and the second shooting moment, and determining the intermediate shooting moment as the first moment;
and determining the shooting interval time length between the first shooting time and the second shooting time, and determining a first angular speed corresponding to the vehicle at the first time according to the absolute value of the angle difference and the shooting interval time length.
13. The method of claim 5, wherein the determining a first angular speed change curve corresponding to the vehicle according to the first angular speed, the second angular speed, the first moment and the second moment comprises:
determining an initial angular speed change curve corresponding to the vehicle according to the first moment, the first angular speed, the second moment and the second angular speed;
determining a mean moment between the first moment and the second moment, and determining a mean angular speed between the first angular speed and the second angular speed as an angular speed corresponding to the vehicle at the mean moment;
and determining a first angular speed change curve corresponding to the vehicle according to the first moment, the first angular speed, the second moment, the second angular speed, the mean moment and the mean angular speed.
14. A computer device, comprising: a processor, a memory, and a network interface;
the processor is coupled to the memory and the network interface, wherein the network interface is configured to provide network communication functionality, the memory is configured to store program code, and the processor is configured to invoke the program code to cause the computer device to perform the method of any of claims 1-13.
15. A computer-readable storage medium, wherein a computer program is stored in the computer-readable storage medium, and the computer program is adapted to be loaded and executed by a processor to perform the method of any one of claims 1-13.
16. A computer program product or computer program, characterized in that it comprises computer instructions stored in a computer-readable storage medium, said computer instructions being adapted to be read and executed by a processor, to cause a computer device having said processor to perform the method of any of claims 1-13.
CN202111116418.5A 2021-09-23 2021-09-23 Data processing method and device and readable storage medium Pending CN113807282A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111116418.5A CN113807282A (en) 2021-09-23 2021-09-23 Data processing method and device and readable storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111116418.5A CN113807282A (en) 2021-09-23 2021-09-23 Data processing method and device and readable storage medium

Publications (1)

Publication Number Publication Date
CN113807282A true CN113807282A (en) 2021-12-17

Family

ID=78896432

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111116418.5A Pending CN113807282A (en) 2021-09-23 2021-09-23 Data processing method and device and readable storage medium

Country Status (1)

Country Link
CN (1) CN113807282A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115980391A (en) * 2023-03-21 2023-04-18 中国汽车技术研究中心有限公司 Acceleration sensor testing method, apparatus and medium for event data recording system
CN115980391B (en) * 2023-03-21 2023-10-10 中国汽车技术研究中心有限公司 Acceleration sensor testing method, equipment and medium of event data recording system

Similar Documents

Publication Publication Date Title
US9880010B2 (en) Method of and arrangement for mapping range sensor data on image sensor data
US11094112B2 (en) Intelligent capturing of a dynamic physical environment
CN109583415B (en) Traffic light detection and identification method based on fusion of laser radar and camera
CN109374008A (en) A kind of image capturing system and method based on three mesh cameras
CN108932051B (en) Augmented reality image processing method, apparatus and storage medium
KR20190094405A (en) Method and system for video-based positioning and mapping
KR102200299B1 (en) A system implementing management solution of road facility based on 3D-VR multi-sensor system and a method thereof
JP6950832B2 (en) Position coordinate estimation device, position coordinate estimation method and program
CN111046762A (en) Object positioning method, device electronic equipment and storage medium
CN112771576A (en) Position information acquisition method, device and storage medium
KR20090064946A (en) Method and apparatus for generating virtual lane for video based car navigation system
KR102167835B1 (en) Apparatus and method of processing image
JP5214355B2 (en) Vehicle traveling locus observation system, vehicle traveling locus observation method, and program thereof
CN112365549B (en) Attitude correction method and device for vehicle-mounted camera, storage medium and electronic device
Zhou et al. Developing and testing robust autonomy: The university of sydney campus data set
Koschorrek et al. A multi-sensor traffic scene dataset with omnidirectional video
CN113240813A (en) Three-dimensional point cloud information determination method and device
CN111353453A (en) Obstacle detection method and apparatus for vehicle
JP5544595B2 (en) Map image processing apparatus, map image processing method, and computer program
CN113587934A (en) Robot, indoor positioning method and device and readable storage medium
CN113807282A (en) Data processing method and device and readable storage medium
EP3816938A1 (en) Region clipping method and recording medium storing region clipping program
CN113902047B (en) Image element matching method, device, equipment and storage medium
Schön et al. Integrated navigation of cameras for augmented reality
CN113011212A (en) Image recognition method and device and vehicle

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination