CN106447730B - Parameter estimation method and device and electronic equipment - Google Patents

Parameter estimation method and device and electronic equipment

Info

Publication number
CN106447730B
CN106447730B
Authority
CN
China
Prior art keywords
frame image
position coordinate
current
determining
previous
Prior art date
Legal status
Active
Application number
CN201610824392.2A
Other languages
Chinese (zh)
Other versions
CN106447730A (en)
Inventor
余轶南
黄畅
都大龙
余凯
Current Assignee
Shenzhen Horizon Robotics Science and Technology Co Ltd
Original Assignee
Shenzhen Horizon Robotics Science and Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Shenzhen Horizon Robotics Science and Technology Co Ltd
Priority to CN201610824392.2A
Publication of CN106447730A
Application granted
Publication of CN106447730B
Legal status: Active
Anticipated expiration

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 - Indexing scheme for image analysis or image enhancement
    • G06T2207/10 - Image acquisition modality
    • G06T2207/10016 - Video; Image sequence
    • G06T2207/10024 - Color image
    • G06T2207/30 - Subject of image; Context of image processing
    • G06T2207/30244 - Camera pose
    • G06T2207/30248 - Vehicle exterior or interior
    • G06T2207/30252 - Vehicle exterior; Vicinity of vehicle

Landscapes

  • Image Analysis (AREA)

Abstract

A parameter estimation method, a parameter estimation device and an electronic device are disclosed. The method comprises the following steps: acquiring a current frame image and a previous frame image acquired by an imaging device; determining a mapping relationship between the current frame image and the previous frame image; and estimating the current position coordinate of the vanishing line in the current frame image according to the mapping relation and the previous position coordinate of the vanishing line in the previous frame image. Therefore, the position coordinates of the vanishing line in the current frame image collected by the imaging device can be estimated with higher precision, and the real-time vanishing line estimation requirement is met.

Description

Parameter estimation method and device and electronic equipment
Technical Field
The present application relates to the field of image processing, and more particularly, to a parameter estimation method, apparatus, electronic device, computer program product, and computer-readable storage medium.
Background
A carrier (e.g., a vehicle) vibrates due to bumps during movement, thereby causing changes in the position and angle, relative to the ground plane, of the sensors (e.g., an imaging device) mounted on it. This phenomenon interferes with the environmental perception of the various sensors, including the imaging device, so that the sensors cannot provide correct sensing results and thus cannot, for example, properly assist the driver in driving operations. Therefore, accurately calibrating the above parameters of the various sensors mounted on the carrier is critical.
To this end, a vanishing line estimation technique has been proposed in the prior art, which can be used to calibrate the position and angle of the imaging device itself with respect to the ground plane, particularly in the pitch direction, with notable effect. For example, in a first prior-art scheme, the position coordinates of the vanishing line can be calculated directly from the image captured by the imaging device. However, the vanishing line may not be captured completely because it is occluded by objects such as a preceding vehicle or a building, and the amount of calculation is large and the processing cost is high. For another example, in a second prior-art scheme, the position coordinates of the vanishing line can also be determined by calculating the angle between captured parallel lane lines on the road surface. However, this scheme also fails to provide an all-weather vanishing line estimation function, since the lane lines may be incomplete on some roads, or may not exist at all.
Thus, existing vanishing line estimation techniques are inaccurate and unreliable.
Disclosure of Invention
The present application is proposed to solve the above-mentioned technical problems. Embodiments of the present application provide a parameter estimation method, apparatus, electronic device, computer program product, and computer-readable storage medium, which enable a position coordinate of a vanishing line in a current frame image acquired by an imaging device to be estimated with higher accuracy, and meet a real-time vanishing line estimation requirement.
According to an aspect of the present application, there is provided a parameter estimation method, including: acquiring a current frame image and a previous frame image acquired by an imaging device; determining a mapping relationship between the current frame image and the previous frame image; and estimating the current position coordinate of the vanishing line in the current frame image according to the mapping relation and the previous position coordinate of the vanishing line in the previous frame image.
In one embodiment of the present application, determining the mapping relationship between the current frame image and the previous frame image comprises: sampling the previous frame image to obtain a pixel point set, wherein the position coordinate of each pixel point in the pixel point set in the previous frame image is a known position coordinate; determining the current position coordinates of each pixel point in the pixel point set in the current frame image; and determining the mapping relation according to the known position coordinates and the current position coordinates of each pixel point.
In one embodiment of the present application, sampling the previous frame image comprises: the previous frame image is sampled in a wide baseline region, which is a region where the ratio of object distance to focal distance is greater than or equal to a threshold.
In an embodiment of the present application, determining the current position coordinates of each pixel point in the pixel point set in the current frame image includes: tracking each pixel point in the pixel point set in the current frame image based on an optical flow algorithm; and determining the current position coordinates of each pixel point in the current frame image according to the tracking result.
In an embodiment of the present application, determining the mapping relationship according to the known position coordinates and the current position coordinates of each pixel point includes: the mapping relationship is determined by optimizing the fitting error using the known position coordinates and the current position coordinates of each pixel point.
In one embodiment of the present application, determining the mapping relationship between the current frame image and the previous frame image further comprises: for each pixel point in the pixel point set, calculating a prediction error of the pixel point according to the mapping relationship, the known position coordinate, and the current position coordinate of the pixel point; sorting the prediction errors of all pixel points; and re-determining the mapping relationship by re-optimizing the fitting error using the known position coordinates and the current position coordinates of a predetermined number of pixel points having the smallest errors.
In one embodiment of the present application, determining the mapping relationship between the current frame image and the previous frame image comprises: detecting a previous feature point set in the previous frame image and determining position coordinates and feature information of each feature point in the previous feature point set; detecting a current feature point set in the current frame image and determining position coordinates and feature information of each feature point in the current feature point set; determining matching pairs of feature points in the previous frame image and the current frame image according to the feature information; and determining the mapping relation according to the previous position coordinates of the matched feature point pairs in the previous frame image and the current position coordinates in the current frame image.
In one embodiment of the present application, detecting a set of previous feature points in the previous frame image comprises: the previous feature point set is detected in a wide baseline region of the previous frame image, the wide baseline region being a region where a ratio between an object distance and a focal distance is greater than or equal to a threshold.
In one embodiment of the present application, the parameter estimation method further includes: calculating a position coordinate difference between the current position coordinates and the previous position coordinates of the vanishing line; and correcting the current position coordinate according to the position coordinate difference.
In one embodiment of the present application, correcting the current position coordinates according to the position coordinate difference includes: comparing the position coordinate difference to a threshold; in response to the position coordinate difference being greater than or equal to the threshold, directly determining the estimated current position coordinate as a modified current position coordinate; and determining a modified current position coordinate by a weight coefficient, the estimated current position coordinate, and a default position coordinate of the vanishing line in response to the position coordinate difference being less than the threshold.
In one embodiment of the present application, determining the modified current position coordinates by a weight coefficient, the estimated current position coordinates, and the default position coordinates of the vanishing line includes: determining the weight coefficient according to the position coordinate difference, wherein the larger the position coordinate difference is, the larger the proportion of the estimated current position coordinate in the corrected current position coordinate is, and the smaller the position coordinate difference is, the smaller that proportion is; and determining the corrected current position coordinate through the weight coefficient, the estimated current position coordinate, and the default position coordinate of the vanishing line.
In one embodiment of the present application, the parameter estimation method further includes: calculating a position coordinate difference between the current position coordinates and the previous position coordinates of the vanishing line; and calculating the variation of the pitch angle of the imaging device relative to the ground plane according to the position coordinate difference.
According to another aspect of the present application, there is provided a parameter estimation apparatus, including: the image acquisition unit is used for acquiring a current frame image and a previous frame image which are acquired by the imaging device; a relationship determination unit for determining a mapping relationship between the current frame image and the previous frame image; and a coordinate estimation unit for estimating the current position coordinate of the vanishing line in the current frame image according to the mapping relation and the previous position coordinate of the vanishing line in the previous frame image.
According to another aspect of the present application, there is provided an electronic device including: a processor; a memory; and computer program instructions stored in the memory, which when executed by the processor, cause the processor to perform the above-described parameter estimation method.
In one embodiment of the present application, the electronic device further includes: an imaging device configured to acquire the current frame image and the previous frame image.
According to another aspect of the application, a computer program product is provided, comprising computer program instructions which, when executed by a processor, cause the processor to perform the above-described parameter estimation method.
According to another aspect of the present application, there is provided a computer readable storage medium having stored thereon computer program instructions which, when executed by a processor, cause the processor to perform the above-described parameter estimation method.
Compared with the prior art, with the parameter estimation method, device, electronic device, computer program product and computer readable storage medium according to the embodiments of the present application, it is possible to receive an image frame sequence acquired by an imaging device, determine a mapping relationship between a current frame image and a previous frame image, and estimate a current position coordinate of a vanishing line in the current frame image according to the mapping relationship and a previous position coordinate of the vanishing line in the previous frame image. Therefore, the position coordinates of the vanishing line in the current image acquired by the imaging device can be estimated with higher precision, and the real-time vanishing line estimation requirement is met.
Drawings
The above and other objects, features and advantages of the present application will become more apparent by describing in more detail embodiments of the present application with reference to the attached drawings. The accompanying drawings are included to provide a further understanding of the embodiments of the application and are incorporated in and constitute a part of this specification, illustrate embodiments of the application and together with the description serve to explain the principles of the application. In the drawings, like reference numbers generally represent like parts or steps.
Fig. 1A is a schematic diagram illustrating an image in which a vanishing line acquired by an imaging device according to an embodiment of the present application is located at a default position, and fig. 1B is a schematic diagram illustrating an image in which a vanishing line acquired by an imaging device according to an embodiment of the present application is deviated from a default position.
Fig. 2 illustrates a flow chart of a parameter estimation method according to a first embodiment of the present application.
Fig. 3 illustrates a flowchart of the step of determining a mapping relationship according to the first example of the embodiment of the present application.
FIG. 4 illustrates a schematic diagram of a wide baseline region in accordance with an embodiment of the present application.
Fig. 5 illustrates a flowchart of the step of determining a mapping relationship according to the second example of the embodiment of the present application.
Fig. 6 illustrates a flow chart of a parameter estimation method according to a second embodiment of the present application.
FIG. 7 illustrates a flowchart of the step of weighted correction according to an embodiment of the application.
Fig. 8 illustrates a block diagram of a parameter estimation apparatus according to an embodiment of the present application.
FIG. 9 illustrates a block diagram of an electronic device in accordance with an embodiment of the present application.
Detailed Description
Hereinafter, example embodiments according to the present application will be described in detail with reference to the accompanying drawings. It should be understood that the described embodiments are only some embodiments of the present application and not all embodiments of the present application, and that the present application is not limited by the example embodiments described herein.
Summary of the application
As described above, there has been no effective vanishing line estimation method that can estimate the position of a vanishing line in an image captured by an imaging device under all conditions, because the vanishing line is easily occluded and lane lines may be incomplete or absent.
In view of this technical problem, the basic idea of the present application is to propose a new parameter estimation method, apparatus, electronic device, computer program product and computer readable storage medium, which may receive an image frame sequence acquired by an imaging device, determine a mapping relationship between a current frame image and a previous frame image, and estimate a current position coordinate of a vanishing line in the current frame image according to the mapping relationship and a previous position coordinate of the vanishing line in the previous frame image. Therefore, the embodiment of the application according to the basic concept can estimate the position coordinates of the vanishing line in the current frame image acquired by the imaging device with higher precision, and meets the real-time vanishing line estimation requirement.
Having described the general principles of the present application, various non-limiting embodiments of the present application will now be described with reference to the accompanying drawings.
Application scene overview
Embodiments of the present application may be applied to various scenarios. For example, embodiments of the present application may be used to estimate parameters of an imaging device mounted on a carrier. The carrier may be of different types, such as a vehicle, an aircraft, a spacecraft, a watercraft, or the like. For convenience of explanation, the description will continue with a vehicle as an example of the carrier.
For example, one or more imaging devices are generally mounted on a vehicle for driving assistance and similar purposes. These imaging devices may be used to capture images of the path of travel, to prevent the vehicle from colliding with other objects on the roadway, to guide the vehicle along the correct route, and so on. Due to manufacturing tolerances, each vehicle must undergo independent end-of-line camera calibration or aftermarket camera adjustment to determine the tilt angle of the imaging device on the vehicle and thus the default position of the vanishing line in the image captured by the imaging device, which is ultimately used for driving assistance and similar purposes.
Fig. 1A is a schematic diagram illustrating an image in which a vanishing line acquired by an imaging device according to an embodiment of the present application is located at a default position, and fig. 1B is a schematic diagram illustrating an image in which a vanishing line acquired by an imaging device according to an embodiment of the present application is deviated from a default position.
As shown in fig. 1A, after calibration, the vanishing line should under normal conditions always lie exactly at a default position (the position shown by the dashed line in fig. 1A) in the image captured by the imaging device. However, because the vehicle body tilts forward and backward due to bumping during movement, the pitch angle of the imaging device mounted on the vehicle body relative to the ground plane may change, and the vanishing line may deviate from the default position in the captured image. As shown in fig. 1B, when the head of the vehicle tilts up and the tail dips down due to a jolt, the true position of the vanishing line (the position shown by the two-dot chain line in fig. 1B) moves downward from the default position (the position shown by the dashed line in fig. 1B). Conversely, when the head of the vehicle dips down and the tail tilts up, the true position of the vanishing line moves upward from the default position.
In this way, if images captured by the imaging device continue to be analyzed and processed using the default position of the vanishing line while the vehicle is inclined, errors are inevitably introduced into the results of the image analysis and processing; and if driving-assistance operations are then performed based on these erroneous results, the road condition may be misjudged.
For example, assume that another vehicle is 100 meters ahead of the current vehicle. When the current vehicle is not bumping, the vanishing line is correctly located at the default position, and the results of image analysis and object recognition correctly indicate that the other vehicle is 100 meters ahead of the current vehicle. However, once the head of the current vehicle tilts upward due to vibration, the true position of the vanishing line moves downward from the default position; if image analysis and object recognition continue to be performed based on the default position, the other vehicle may be erroneously recognized as being less than 100 meters ahead of the current vehicle. This may trigger erroneous driving-assistance operations, cause the vehicle to take unnecessary measures such as braking and warning, degrade the user experience, and even create traffic safety hazards.
For this reason, in the embodiment of the present application, the image frame sequence acquired by the imaging device itself may be used to estimate the position coordinates of the vanishing line in the current frame image acquired by the imaging device, and optionally further estimate the pitch angle of the imaging device relative to the ground plane according to the estimated position coordinates of the vanishing line, thereby reducing the influence on the sensing accuracy of various sensors equipped on the vehicle due to the pitching motion and the like.
Of course, although the embodiments of the present application have been described above by taking a vehicle as an example, the present application is not limited thereto. The embodiments of the present application can be applied to estimating parameters of an imaging device equipped on various electronic apparatuses. For example, the embodiments of the present application can be equally applied to a parameter estimation operation of an imaging device equipped on a movable robot, an imaging device provided at a fixed position for monitoring, and the like.
In the following, various embodiments according to the present application will be described with reference to the accompanying drawings in connection with the application scenarios of fig. 1A and 1B.
Exemplary method
Fig. 2 illustrates a flow chart of a parameter estimation method according to a first embodiment of the present application.
As shown in fig. 2, the parameter estimation method according to the first embodiment of the present application may include:
in step S110, a current frame image and a previous frame image acquired by the imaging device are acquired.
Hereinafter, for convenience of explanation, the description will be continued by taking the case where the imaging device is equipped on a vehicle as an example.
For example, the imaging device may be an image sensor for capturing a sequence of image frames, and may be a camera or a camera array. The image data sequence acquired by the image sensor may be a continuous image frame sequence (i.e., a video stream) or a discrete image frame sequence (i.e., an image data set sampled at predetermined sampling time points), etc. The camera may be a monocular camera, a binocular camera, a multi-view camera, etc., and it may capture a grayscale image or a color image carrying color information. Of course, any other type of camera known in the art or developed in the future may be applied to the present application, and the present application places no particular limitation on the manner in which an image is captured, as long as grayscale or color information of the input image can be obtained. To reduce the amount of computation in subsequent operations, in one embodiment the color image may be converted to grayscale before analysis and processing.
For example, the imaging device may continually capture image frames, which may be continually analyzed and processed, and stored in memory for subsequent processing.
For example, in the method, when the imaging device acquires the current frame image, the current frame image may be received, and simultaneously with or before or after the current frame image, the previous frame image of the current frame image may be read from the memory. For example, the previous frame image may be a previous frame image adjacent to the current frame image, or in a case where the image content does not change much, the previous frame image may be a frame image a few frames before the current frame image.
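As a minimal illustrative sketch (not part of the original patent text), the frame acquisition and grayscale conversion described above could look as follows in Python with OpenCV; the video source name is a placeholder assumption.

```python
import cv2

# Read frames from the imaging device (here a video file stands in for it),
# convert each color frame to grayscale, and keep the previous frame in
# memory so the mapping relationship of step S120 can be computed.
cap = cv2.VideoCapture("front_camera.mp4")  # assumed source; could be a device index

prev_gray = None
while True:
    ok, frame = cap.read()
    if not ok:
        break
    cur_gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    if prev_gray is not None:
        pass  # determine the mapping relationship between prev_gray and cur_gray (step S120)
    prev_gray = cur_gray  # the current frame becomes the previous frame of the next frame
cap.release()
```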
In step S120, a mapping relationship between the current frame image and the previous frame image is determined.
After acquiring the current frame image and the previous frame image acquired by the imaging device, the current frame image and the previous frame image may be analyzed and processed by various methods to calculate a mapping relationship therebetween.
Fig. 3 illustrates a flowchart of the step of determining a mapping relationship according to the first example of the embodiment of the present application.
As shown in fig. 3, step S120 may include:
in the sub-step S121, the previous frame image is sampled to obtain a pixel point set, and a position coordinate of each pixel point in the pixel point set in the previous frame image is a known position coordinate.
For example, when a current frame image is acquired, a previous frame image may be read from a memory and sampled to obtain a set of pixels at known locations in the previous frame image.
Since the embodiments of the present application mainly focus on the position coordinates of the vanishing line in the image, and the vanishing line always lies at a position very far from the camera, only a specific region of interest (ROI) of the image frame, rather than its entire area, may be sampled in order to reduce the amount of data computation and improve the data processing accuracy.
For example, the previous frame image may be sampled in a wide baseline region, which is a region where the ratio of object distance to focal distance is greater than or equal to a threshold. In other words, the wide baseline region may be a region where the object distance is much greater than the focal distance. In practice, the position and size of the wide baseline region may be determined empirically, manually.
FIG. 4 illustrates a schematic diagram of a wide baseline region in accordance with an embodiment of the present application.
As shown in fig. 4, assume that the resolution of an image captured by the imaging device is 640 × 480 pixels. For example, the wide baseline region may be empirically defined as a 300 × 100 pixel region located at the center of the image, although the application is not so limited. In actual use, the position of the wide baseline region in the image needs to be properly selected according to the end-of-line camera calibration or aftermarket camera adjustment and the resolution of the image.
In the first example of the embodiment of the present application, the wide baseline region may be sampled by various sampling methods. For example, a uniform sampling method may be used. Specifically, grid points with uniform spacing at preset positions may be selected in the wide baseline region, for example one pixel point every M pixels within a row and one row every N rows, where M and N are natural numbers. The advantage of uniform sampling is that the amount of calculation can be reduced and the calculation speed increased when the mapping relationship is subsequently determined. Of course, the present application is not limited thereto. For example, other methods such as random sampling may also be used to sample the wide baseline region. Regardless of the sampling method, the position coordinates of each pixel point of the sampled pixel point set in the previous frame image are known.
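The following is a rough sketch of such uniform grid sampling inside the wide baseline region; the ROI placement follows the 300 × 100 example of fig. 4, while the sampling intervals M and N are assumed example values rather than values prescribed by the patent.

```python
import numpy as np

def sample_grid(roi_x, roi_y, roi_w, roi_h, m=10, n=10):
    """Uniformly sample grid points inside the wide baseline region:
    one pixel point every m columns and every n rows. Returns an array of
    (x, y) position coordinates in the previous frame image."""
    xs = np.arange(roi_x, roi_x + roi_w, m)
    ys = np.arange(roi_y, roi_y + roi_h, n)
    return np.array([(x, y) for y in ys for x in xs], dtype=np.float32)

# Example: a 300 x 100 pixel region centered in a 640 x 480 image (as in fig. 4)
prev_pts = sample_grid(roi_x=170, roi_y=190, roi_w=300, roi_h=100)
```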
It should be noted that, although the above description has been made by taking an example in which the entire previous frame image is first stored, and then the operation of obtaining the pixel point sets is performed after the previous frame image is read from the memory, the present application is not limited thereto. For example, in order to reduce the storage amount, only the region of interest (e.g., the wide baseline region) in the previous frame image may be stored, or the previous frame image may be sampled in advance to obtain the pixel point set, and only the gray information and the position information of each pixel point in the pixel point set are stored in the memory, instead of storing the complete previous frame image. Therefore, when the current frame image is acquired, the gray information, the position information and the like of each pixel point can be directly read from the memory, and the execution speed of the vanishing line estimation operation in the current frame is improved.
In sub-step S122, the current position coordinates of each pixel point in the pixel point set in the current frame image are determined.
After a set of pixel points (e.g., grid points) has been selected in the wide baseline region of the previous frame, each of the pixel points in the current frame may be tracked to determine their current position coordinates in the current frame image.
In one embodiment, since the transformation between two adjacent frames is very small, the moving distance of each pixel point in the image is usually very limited; therefore, to reduce the amount of calculation, each pixel point can be tracked within a range obtained by expanding the wide baseline region by a certain height and width in the current frame image.
For example, each pixel point in the set of pixel points may be tracked in the current frame image based on an optical flow algorithm, and the current position coordinate of each pixel point in the current frame image may be determined according to the tracking result.
Optical flow is the pattern of apparent motion of objects, surfaces, and edges in a visual scene caused by the relative motion between an observer (eye or camera) and the scene, and it can be used to represent the correspondence between pixels in two temporally adjacent image frames. The optical flow method estimates the moving speed and direction of an object by detecting the change of image pixel intensity I over time. In general, let the intensity of the pixel located at (x, y, t) be I(x, y, t), where x and y are the horizontal and vertical coordinates of the pixel in the image and t is the time coordinate of the image. Based on the assumption that the intensity of the same pixel is approximately constant between two adjacent image frames, the following formula can be written and expanded as a Taylor series:

I(x, y, t) = I(x + Δx, y + Δy, t + Δt)

Neglecting the higher-order terms (H.O.T.), this gives

(∂I/∂x)Δx + (∂I/∂y)Δy + (∂I/∂t)Δt = 0

or, dividing by Δt,

(∂I/∂x)(Δx/Δt) + (∂I/∂y)(Δy/Δt) + ∂I/∂t = 0

which leads to

(∂I/∂x)·Vx + (∂I/∂y)·Vy + ∂I/∂t = 0

where Vx and Vy are the velocity components in the x and y directions, i.e., the optical flow of I(x, y, t); ∂I/∂x and ∂I/∂y are the intensity differences between adjacent pixel points in the x and y directions of the image, and ∂I/∂t is the intensity difference of the same pixel point between two adjacent frames. This single equation contains two unknowns Vx and Vy and therefore cannot be solved by itself; this is the aperture problem of the optical flow algorithm. Another constraint is needed to obtain the optical flow vector (Vx, Vy).

For example, the Lucas-Kanade method (LK method) can be used to solve the above problem. The LK method is a non-iterative method that assumes the optical flow equation holds for all pixels in a small window centered around the point of interest p, i.e., the local image flow (velocity) vector (Vx, Vy) must satisfy the following equations:

Ix(q1)·Vx + Iy(q1)·Vy = −It(q1)
Ix(q2)·Vx + Iy(q2)·Vy = −It(q2)
…
Ix(qn)·Vx + Iy(qn)·Vy = −It(qn)

where q1, q2, …, qn are the pixel points within the small window, and Ix(qi), Iy(qi) and It(qi) are the partial derivatives of the image with respect to the positions x, y and t, estimated at the pixel point qi at the current time.

It can be seen that the above system of equations has only two unknowns but more than two equations, i.e., it is an overdetermined system containing redundancy. To solve this overdetermined problem, the least squares method is used to obtain the optical flow (Vx, Vy). Since the specific derivation and solving process is not the focus of the present application, it is not repeated here.
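For reference, the closed-form least-squares solution (a standard textbook result, not reproduced in the original text) can be written compactly as

(Vx, Vy)ᵀ = (AᵀA)⁻¹·Aᵀ·b

where the i-th row of A is [Ix(qi)  Iy(qi)] and the i-th entry of b is −It(qi).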
In one embodiment, for example, when tracking each point in the set of pixel points using optical flow, a pyramid optical flow algorithm may be further used to track each pixel point one by one to further improve the accuracy and speed of the tracking algorithm.
It should be noted that, although the specific operation of pixel tracking is described above in the optical flow tracking algorithm based on the LK method, the present application is not limited thereto. For example, other optical flow tracking algorithms based on the Horn-Schunck method, the Buxton-Buxton method, and the like may also be used. Furthermore, in addition to the optical flow tracking algorithm, the tracking operation of the pixel points can be implemented by using any known or later developed algorithms, such as Camshift, particle filter algorithm, and the like.
After the optical flow vector (Vx, Vy) has been calculated, the position coordinates (x1, y1) in the current frame image of the pixel point located at (x0, y0) in the previous frame image can be obtained by the following formulas:
x1 = x0 + Vx
y1 = y0 + Vy
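A minimal sketch of this tracking step, using the pyramid Lucas-Kanade implementation available in OpenCV, might look as follows; the window size and pyramid depth are assumed example values.

```python
import cv2
import numpy as np

def track_points(prev_gray, cur_gray, prev_pts):
    """Track sampled pixel points from the previous frame image into the
    current frame image with pyramidal Lucas-Kanade optical flow, and keep
    only the points that were tracked successfully."""
    pts = prev_pts.reshape(-1, 1, 2).astype(np.float32)
    cur_pts, status, _err = cv2.calcOpticalFlowPyrLK(
        prev_gray, cur_gray, pts, None,
        winSize=(21, 21), maxLevel=3,  # assumed window size and pyramid levels
        criteria=(cv2.TERM_CRITERIA_EPS | cv2.TERM_CRITERIA_COUNT, 30, 0.01))
    good = status.reshape(-1) == 1
    return prev_pts[good], cur_pts.reshape(-1, 2)[good]
```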
In sub-step S123, the mapping relationship is determined according to the known position coordinates and the current position coordinates of each pixel point.
For example, the mapping relationship may be determined by optimizing the fitting error using the known position coordinates and the current position coordinates of each pixel point.
Since the region of interest is the region where the vanishing line lies (i.e., the wide baseline region) and the object distance is far greater than the focal distance, a homography matrix can be used to fit the mapping relationship (also called the transformation relationship) between the previous frame and the current frame. In computer vision, the homography of a plane is defined as the projective mapping from one plane to another. Thus, the mapping of points on a two-dimensional plane onto the camera imager is an example of a planar homography. The homography matrix (or transformation matrix) is a 3 × 3 matrix, commonly denoted H.
Specifically, the optimal transformation matrix H may be obtained by optimizing a fitting error between the previous frame image and the current frame image, where the fitting error err is defined as follows:
err = Σi ‖p1,i − H·p0,i‖²     (1)

where p0,i = (x0, y0, 1)ᵀ is the homogeneous form of the coordinates (x0, y0) of the i-th sampled pixel point in the previous frame image, and p1,i = (x1, y1, 1)ᵀ is the homogeneous form of the coordinates (x1, y1) of the same pixel point in the current frame image.
By minimizing the fitting error err over the position coordinates of all sampling points in the previous frame image and their tracked position coordinates in the current frame image, the transformation matrix H is obtained as the optimal solution and is then used in subsequent operations to estimate the current position coordinates of the vanishing line in the current frame image.
However, since there is a noise problem in optical flow tracking, i.e., there may be erroneous tracking results in the optical flow tracking results, in one embodiment, a more robust transformation matrix H calculation method may be employed. For example, after the mapping relationship is determined by optimizing the fitting error using the known position coordinates and the current position coordinates of each pixel point, the mapping relationship may be further corrected.
In one embodiment, the mapping relationship may be modified by: for each pixel point in the pixel point set, calculating the prediction error of the pixel point according to the mapping relationship, the known position coordinate, and the current position coordinate of the pixel point; sorting the prediction errors of all pixel points; and re-determining the mapping relationship by re-optimizing the fitting error using the known position coordinates and the current position coordinates of a predetermined number of pixel points having the smallest errors.
Specifically, the transformation matrix H may first be estimated using all valid optical flow tracking results by optimizing the fitting error err between the previous frame image and the current frame image, as described above. Then, after the transformation matrix H has been obtained, a prediction error may be calculated for each point; the formula for the prediction error is the same as formula (1) defining the fitting error err. Next, the prediction errors calculated for all points may be sorted. Finally, a certain proportion of the samples with the smallest errors (e.g., 30%) may be selected as refitting samples to refit the transformation matrix H, obtaining the final transformation matrix H between the two frames of images.
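A rough sketch of this more robust fitting procedure is shown below; the 30% retention ratio follows the example in the text, while the use of cv2.findHomography as the least-squares fitter is an assumption of this sketch rather than something prescribed by the patent.

```python
import cv2
import numpy as np

def fit_homography_robust(prev_pts, cur_pts, keep_ratio=0.3):
    """Fit H with all tracked points, rank the points by prediction error,
    then refit H with the fraction of points having the smallest errors."""
    # Initial fit over all valid optical flow tracking results (plain least squares).
    H, _ = cv2.findHomography(prev_pts, cur_pts, method=0)

    # Prediction error per point: distance between the tracked position and the
    # position predicted by mapping the previous-frame point through H.
    ones = np.ones((len(prev_pts), 1))
    prev_h = np.hstack([prev_pts, ones])          # homogeneous coordinates
    pred_h = (H @ prev_h.T).T
    pred = pred_h[:, :2] / pred_h[:, 2:3]
    errors = np.linalg.norm(pred - cur_pts, axis=1)

    # Keep the points with the smallest prediction errors and refit.
    order = np.argsort(errors)
    keep = order[: max(4, int(len(order) * keep_ratio))]  # at least 4 points for H
    H_refit, _ = cv2.findHomography(prev_pts[keep], cur_pts[keep], method=0)
    return H_refit
```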
It should be noted that, although the first example of the embodiment of the present application has been described as an example in which pixel sampling is performed on the previous frame image, the current position coordinate of each sampled pixel in the current frame image is determined, and the mapping relationship is determined according to the known position coordinate and the current position coordinate of the pixel, the present application is not limited thereto.
Fig. 5 illustrates a flowchart of the step of determining a mapping relationship according to the second example of the embodiment of the present application.
As shown in fig. 5, step S120 may include:
in sub-step S124, a previous feature point set is detected in the previous frame image and the position coordinates and feature information of each feature point in the previous feature point set are determined.
Similarly to sub-step S121, in order to reduce the amount of data computation and improve the data processing accuracy, feature point detection may be performed on the previous frame image within the wide baseline region, and the position information and feature information of each feature point may be obtained. For example, the position information of a feature point may be its position coordinates in the image frame, and its feature information may be a feature descriptor (also called a feature description vector, or simply a feature description).
For example, various feature point detection algorithms may be used to detect and characterize feature points in a wide baseline region of a previous frame image. In general, feature point extraction is to extract features that remain unchanged in different types of images, such as edge points, corner points, interest points, and centers of closed regions. The feature points are used as registration control points, and the number of the feature points is required to be sufficient and the feature points are required to be distributed uniformly. Currently, commonly used feature point extraction methods include Harris (Harris) algorithm, Scale Invariant Feature Transform (SIFT) algorithm, Speeded Up Robust Feature (SURF) algorithm, and the like.
The Harris algorithm uses a new corner point determination method. The two eigenvalues λ1 and λ2 of the matrix M are proportional to the principal curvatures of M, and Harris uses λ1 and λ2 to characterize the directions of fastest and slowest change. If both are large, the pixel point is a corner point; if one is large and one is small, the pixel point lies on an edge; if both are small, the pixel point lies in a slowly varying image region.
For feature point detection, the SIFT algorithm repeatedly filters and downsamples the input image with Gaussian kernels of different scales to form a Gaussian pyramid, and then subtracts Gaussian pyramid images of adjacent scales to obtain the difference of Gaussians (DoG). Each point in the DoG scale space is compared one by one with the points at adjacent scales and adjacent positions to find the local extrema, i.e., the positions of the feature points and their corresponding scales.
The SURF algorithm borrows the idea of simplified approximation from SIFT and approximates the Gaussian second-order derivative templates in the determinant of the Hessian (DoH), so that filtering the image with these templates requires only a few simple addition and subtraction operations, whose number depends only on the size of the filtering template. The algorithm performs fast computation on the basis of the integral image, and the feature points are detected with a fast Hessian detector.
It should be noted that, although several specific feature point detection algorithms are listed above, the present application is not limited thereto. The feature point detection algorithm, whether existing or developed in the future, may be applied to the parameter estimation method according to the embodiment of the present application, and should also be included in the scope of protection of the present application.
After the feature points have been detected and described with a specific feature descriptor to obtain their feature information, the position information of each feature point, i.e., its x-axis and y-axis coordinates, can also be determined in the image frame.
In sub-step S125, a current feature point set is detected in the current frame image and position coordinates and feature information of each feature point in the current feature point set are determined.
Similarly, the same feature point detection algorithm as in sub-step S124 may be employed to detect and describe feature points in the current frame image. In one embodiment, since the transformation between two adjacent frames is very small, the moving distance of each pixel point in the image is usually very limited; therefore, in order to reduce the amount of calculation in the subsequent feature point matching process, the current feature point set can be detected within a range obtained by expanding the wide baseline region by a certain height and width in the current frame image.
In sub-step S126, matching pairs of feature points in the previous frame image and the current frame image are determined according to the feature information.
For example, the feature point similarity between the two frame images may be calculated from the description information of the feature points detected in the previous frame image and the current frame image, respectively. The similarity may be measured by a distance metric fdis(fi, fj), including but not limited to the Euclidean distance, where fi is the feature description information of the i-th feature point in the previous frame image and fj is the feature description information of the j-th feature point in the current frame image; the smaller the distance fdis(fi, fj), the higher the similarity between the pair of feature points. A pair of feature points whose similarity is greater than or equal to a predetermined threshold is determined as a matching feature point pair. In this way, the most similar feature point in the current frame image may be selected for each feature point in the previous frame image.
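As an illustrative sketch of sub-steps S124 through S126 (not the patent's prescribed implementation), feature points can be detected and matched with OpenCV; SIFT descriptors compared by L2 (Euclidean) distance are assumed here, and the ratio test used to keep distinctive matches is an assumption of this sketch.

```python
import cv2

def match_feature_points(prev_gray, cur_gray, ratio=0.75):
    """Detect feature points in the previous and current frame images and
    determine matching pairs by comparing descriptors with L2 distance."""
    sift = cv2.SIFT_create()  # requires an OpenCV build with SIFT support
    kp_prev, des_prev = sift.detectAndCompute(prev_gray, None)
    kp_cur, des_cur = sift.detectAndCompute(cur_gray, None)

    matcher = cv2.BFMatcher(cv2.NORM_L2)
    pairs = []
    # For each feature point of the previous frame, look at its two nearest
    # neighbours in the current frame and keep only clearly distinctive matches.
    for knn in matcher.knnMatch(des_prev, des_cur, k=2):
        if len(knn) == 2 and knn[0].distance < ratio * knn[1].distance:
            m = knn[0]
            pairs.append((kp_prev[m.queryIdx].pt, kp_cur[m.trainIdx].pt))
    return pairs  # list of ((x0, y0), (x1, y1)) matched position coordinates
```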
In sub-step S127, the mapping relationship is determined according to the previous position coordinates of the matching feature point pairs in the previous frame image and the current position coordinates in the current frame image.
For example, the mapping relationship may be determined by optimizing the fitting error using the previous position coordinates and the current position coordinates of each feature point.
Since sub-step S127 is substantially identical to sub-step S123, and a more robust transformation matrix H calculation method can also be employed, a similar description thereof is omitted here.
In step S130, the current position coordinate of the vanishing line in the current frame image is estimated according to the mapping relation and the previous position coordinate of the vanishing line in the previous frame image.
After determining a transformation matrix H between the current frame image and the previous frame image, a vanishing line of the current frame is predicted using the transformation matrix. That is, the relationship between the coordinate cur _ y of the vanishing line in the current frame on the y-axis and the coordinate last _ y of the vanishing line in the previous frame on the y-axis is:
cur_y=last_y×H。
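The patent expresses the prediction compactly as cur_y = last_y × H. One possible concrete reading, which is an interpretation made by this sketch rather than something spelled out in the text, is to map a point lying on the previous frame's vanishing line through H in homogeneous coordinates and take the y component of the result:

```python
import numpy as np

def predict_vanishing_line_y(H, last_y, image_width=640):
    """Predict the y coordinate of the vanishing line in the current frame by
    mapping a point on the previous frame's vanishing line through H.
    Using the image-centre column as the point's x coordinate is an
    assumption of this sketch."""
    p_prev = np.array([image_width / 2.0, last_y, 1.0])  # homogeneous point on the line
    p_cur = H @ p_prev
    return p_cur[1] / p_cur[2]                           # cur_y
```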
Thus, the position coordinates of the vanishing line in the current frame are successfully estimated; they can subsequently be used to determine the pitch angle of the imaging device on the vehicle relative to its calibrated state (in which the vanishing line lies at the default position in the captured image), ultimately serving driving assistance and similar purposes.
In one embodiment, the parameter estimation method according to the embodiment of the present application may be cyclically performed to estimate the position coordinates of the vanishing lines in each frame of image captured by the imaging device in real time. For example, after determining the current position coordinates of the vanishing line in the current frame image, the current frame image may be stored in a memory as a previous frame image of a next frame image, thereby estimating the position coordinates of the vanishing line in the next frame image. Of course, as described above, in order to save the storage space, instead of storing the whole image, the current frame image may be partially truncated, sampled by a pixel point, or extracted by a feature point, and the determined position information of the pixel point (including the sampling point and the feature point) and other related information (e.g., the feature descriptor of the feature point) are stored in the memory for subsequent processing.
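Tying the pieces together, one cycle of the first embodiment could be sketched as follows; the helper functions are the illustrative sketches given earlier in this description, not components named by the patent.

```python
def estimate_vanishing_line(prev_gray, cur_gray, last_y):
    """One estimation cycle of the first embodiment: sample, track, fit H,
    and predict the current vanishing-line coordinate."""
    prev_pts = sample_grid(roi_x=170, roi_y=190, roi_w=300, roi_h=100)  # sub-step S121
    p0, p1 = track_points(prev_gray, cur_gray, prev_pts)                # sub-step S122
    H = fit_homography_robust(p0, p1)                                   # sub-step S123
    cur_y = predict_vanishing_line_y(H, last_y)                         # step S130
    return cur_y  # becomes last_y when the next frame image arrives
```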
It can be seen that, with the parameter estimation method according to the first embodiment of the present application, it is possible to receive an image frame sequence acquired by an imaging device, determine a mapping relationship between a current frame image and a previous frame image, and estimate a current position coordinate of a vanishing line in the current frame image according to the mapping relationship and a previous position coordinate of the vanishing line in the previous frame image. Therefore, in the embodiment, the position coordinates of the vanishing line in the current frame image acquired by the imaging device can be estimated with higher precision, the real-time vanishing line estimation requirement is met, the principle is simple, the calculated amount is small, the accuracy is high, and the method has strong adaptability to different scenes.
It has been found through experiments that, as shown in fig. 1B, the estimated position of the vanishing line in the current frame image (the position shown by the one-dot chain line in fig. 1B) obtained with the parameter estimation method according to the first embodiment of the present application is closer to the true position of the vanishing line (the position shown by the two-dot chain line in fig. 1B) than the default position (the position shown by the dashed line in fig. 1B) is. This alleviates the errors that would otherwise occur in image analysis and processing results when the vehicle bumps and the default position of the vanishing line is used indiscriminately.
In the first embodiment, the current position coordinates of the vanishing line in the current frame image are always estimated solely from the mapping relationship between the current frame image and the previous frame image, together with the previous position coordinates of the vanishing line in the previous frame image. However, once an error occurs in the estimation for some frame of the image frame sequence, for whatever reason, that error is necessarily propagated to subsequent estimations; if this persists, the error keeps accumulating and growing, so that subsequent estimation results become erroneous.
In order to solve the above problem, in a second embodiment of the present application, it is proposed: the current position coordinates of the vanishing line in the current frame image may be weighted and corrected after the current position coordinates of the vanishing line in the current frame image are estimated.
Fig. 6 illustrates a flow chart of a parameter estimation method according to a second embodiment of the present application.
In fig. 6, the same reference numerals are used to indicate the same steps as in fig. 2. Thus, steps S110-S130 in FIG. 6 are the same as steps S110-S130 of FIG. 2, and reference may be made to the description above in connection with FIGS. 2 through 5. Fig. 6 differs from fig. 2 in that step S140 is added.
In step S140, a current position coordinate of the vanishing line in the current frame image is weighted and corrected.
Further research has found that when the vehicle runs smoothly with little bumping, the true position of the vanishing line tends to be much closer to the default position determined after end-of-line camera calibration or aftermarket camera adjustment, whereas when the vehicle runs unsteadily with heavy bumping, the true position of the vanishing line tends to be much closer to the position estimated according to steps S110-S130 above. Therefore, the current driving state of the vehicle can be judged, and the current position coordinates of the vanishing line in the current frame image can be weighted and corrected according to that driving state, so as to prevent errors from continuously accumulating in subsequent estimation operations.
FIG. 7 illustrates a flowchart of the step of weighted correction according to an embodiment of the application.
As shown in fig. 7, step S140 may include:
in sub-step S141, a position coordinate difference between the current position coordinates and the previous position coordinates of the vanishing line is calculated.
For example, the transformation difference diff of the vanishing line between the previous frame and the current frame may be calculated as follows:
diff=cur_y-last_y。
wherein the transformation difference diff is a change in position of the vanishing line in the y-axis direction, and when its value is small, it can be considered that the vehicle is in a state of small jerk; and when its value is large, the vehicle can be considered to be in a highly bumpy condition.
In sub-step S142, the current position coordinates are corrected according to the position coordinate difference.
For example, the position coordinate difference may be compared to a threshold; in response to the position coordinate difference being greater than or equal to the threshold, directly determining the estimated current position coordinate as a modified current position coordinate; and in response to the position coordinate difference being less than the threshold, determining a modified current position coordinate by a weighting factor, the estimated current position coordinate, and a default position coordinate of the vanishing line.
For example, the weight coefficient may be a fixed value or a value that varies with the magnitude of the position coordinate difference.
In the latter case, this correction operation may be referred to as spring correction. In this case, for example, determining the corrected current position coordinates by a weight coefficient, the estimated current position coordinates, and the default position coordinates of the vanishing line may include: determining the weight coefficient according to the position coordinate difference, wherein the larger the position coordinate difference is, the larger the proportion of the estimated current position coordinate in the corrected current position coordinate is, and the smaller the position coordinate difference is, the smaller that proportion is; and determining the corrected current position coordinate through the weight coefficient, the estimated current position coordinate, and the default position coordinate of the vanishing line.
Specifically, when the value of diff is greater than a threshold, it indicates that the vehicle is currently in an unsteady driving state with heavy bumping; when the value of diff is less than the threshold, it indicates that the vehicle is currently in a steady driving state with little bumping. For example, when the transformation difference diff between the two frames is large (for example, greater than or equal to 20 pixels), the current change of the vanishing line is considered to be caused by a severe change of the vehicle-body pitch angle, and the estimate is therefore trusted more. Conversely, when the transformation difference diff between the two frames is small (for example, less than 20 pixels), the current change of the vanishing line is considered not to be caused by a severe change of the vehicle-body pitch angle, and the vanishing line is therefore pulled towards the default value; the weight can be set according to the degree of change between the two frames, with a smaller change giving a larger weight to the default value, and the corrected cur_y finally serves as last_y for the next estimation. That is, the corrected current position coordinate cur_y of the vanishing line can be expressed by the following formula:
cur_y=(last_y+diff)×α+default_y×(1-α)。
where α is the correction weight and default_y is the default y-axis coordinate of the vanishing line determined after factory calibration and adjustment. Without spring correction, the correction weight α may, for example, be set to a fixed value of 0.9. With spring correction, the closer the value of diff is to 20, the closer the correction weight α may be to 1, and the closer the value of diff is to 0, the closer α may be to 0.9.
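A small sketch of this correction rule is given below; the 20-pixel threshold and the 0.9 lower bound follow the example values above, while the linear mapping from |diff| to α is an assumption of this sketch (the text only states the direction of the dependence).

```python
def correct_vanishing_line(cur_y, last_y, default_y, threshold=20.0):
    """Weighted ("spring") correction of the estimated vanishing-line coordinate."""
    diff = cur_y - last_y
    if abs(diff) >= threshold:
        return cur_y  # heavy bumping: trust the estimate directly
    # Little bumping: blend the estimate with the calibrated default position.
    alpha = 0.9 + 0.1 * (abs(diff) / threshold)   # ~0.9 when diff ~ 0, ~1 near the threshold
    return (last_y + diff) * alpha + default_y * (1.0 - alpha)
```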
It can be seen that, with the parameter estimation method according to the second embodiment of the present application, after the current position coordinate of the vanishing line in the current frame image is estimated according to the mapping relationship between the current frame image and the previous frame image and the previous position coordinate of the vanishing line in the previous frame image, the estimation result may be further subjected to weighted correction according to the position coordinate difference of the vanishing line between the previous frame image and the current frame image, the default position coordinate of the vanishing line, and a weight coefficient. Therefore, in the present embodiment, the estimation accuracy of the vanishing line can be further improved, and errors are prevented from accumulating and amplifying over subsequent estimation operations, which would otherwise make the subsequent estimation results erroneous.
Experiments show that, compared with the first embodiment, the parameter estimation method according to the second embodiment of the present application obtains a more accurate estimated position of the vanishing line in the current frame image, that is, an estimated position closer to the real position of the vanishing line.
Further, in another embodiment of the present application, after step S130 or step S140, that is, once the current position coordinates of the vanishing line in the current frame image have been calculated, the amount of change in the pitch angle of the imaging device with respect to the ground plane may optionally be determined from the previous position coordinates of the vanishing line in the previous frame image and the current position coordinates of the vanishing line in the current frame image.
Specifically, since there is a linear correspondence between the amount of change in the coordinate of a vanishing line and the amount of change in the pitch angle of the imaging device, the position coordinate difference between the current position coordinate and the previous position coordinate of the vanishing line can be calculated; and calculating the variation of the pitch angle of the imaging device relative to the ground plane according to the position coordinate difference.
Then, the value of the current pitch angle of the imaging device can also be found based on the value of the previous pitch angle and the amount of change.
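Purely as an illustration of this linear correspondence, under a simple pinhole model in which the horizon row depends on the camera pitch through the focal length f_y and the principal point c_y (both assumed here to come from the intrinsic calibration of the imaging device), the change in pitch could be recovered roughly as follows; the function name and the sign convention are assumptions of the example.

```python
import math

def pitch_change_from_vanishing_line(prev_y, cur_y, fy, cy):
    # Assumed pinhole relation: the horizon sits at y = cy - fy * tan(pitch),
    # so each y-coordinate of the vanishing line maps to a pitch angle.
    prev_pitch = math.atan2(cy - prev_y, fy)
    cur_pitch = math.atan2(cy - cur_y, fy)
    return cur_pitch - prev_pitch  # change of pitch, in radians

# For small angles this reduces to the approximately linear relation used in
# the text: delta_pitch ≈ (prev_y - cur_y) / fy.
```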
Furthermore, since all sensors disposed on the vehicle are generally in rigid connection with the vehicle, once the variation of the pitch angle of the imaging device relative to the ground plane is known, the pitch angles of all other vehicle-mounted sensors except the imaging device can be corrected according to the variation, so that the influence of the vehicle bump on the sensing accuracy of various sensors can be comprehensively reduced.
Exemplary devices
Next, a parameter estimation apparatus according to an embodiment of the present application is described with reference to fig. 8.
Fig. 8 illustrates a block diagram of a parameter estimation apparatus according to an embodiment of the present application.
As shown in fig. 8, in one embodiment, the parameter estimation apparatus 100 may include: an image obtaining unit 110, configured to obtain a current frame image and a previous frame image collected by an imaging device; a relation determining unit 120 for determining a mapping relation between the current frame image and the previous frame image; and a coordinate estimation unit 130 for estimating a current position coordinate of the vanishing line in the current frame image according to the mapping relationship and a previous position coordinate of the vanishing line in the previous frame image.
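As a purely structural sketch (class and method names here are assumptions, not the actual implementation), the cooperation of these units could be expressed as follows.

```python
# Minimal composition sketch of the apparatus; the units passed in are assumed
# to expose determine() and estimate() methods for illustration only.
class ParameterEstimationApparatus:
    def __init__(self, relation_unit, coordinate_unit):
        self.relation_unit = relation_unit      # determines the frame-to-frame mapping
        self.coordinate_unit = coordinate_unit  # propagates the vanishing line

    def estimate(self, prev_frame, cur_frame, prev_vanishing_y):
        mapping = self.relation_unit.determine(prev_frame, cur_frame)
        return self.coordinate_unit.estimate(mapping, prev_vanishing_y)
```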
In one example, the relationship determining unit 120 may include: the image sampling module is used for sampling the previous frame image to obtain a pixel point set, and the position coordinate of each pixel point in the pixel point set in the previous frame image is a known position coordinate; a coordinate determination module, configured to determine a current position coordinate of each pixel in the pixel set in the current frame image; and the relation determining module is used for determining the mapping relation according to the known position coordinate and the current position coordinate of each pixel point.
In one example, the image sampling module may sample the previous frame image in a wide baseline region, which is a region where a ratio of object distance to focal distance is greater than or equal to a threshold.
In one example, the coordinate determination module may track each pixel in the set of pixels in the current frame image based on an optical flow algorithm; and determining the current position coordinates of each pixel point in the current frame image according to the tracking result.
In one example, the relationship determination module may determine the mapping relationship by optimizing a fitting error using the known position coordinates and the current position coordinates of each pixel point.
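By way of illustration only, the sampling, tracking and fitting performed by these modules might look roughly like the sketch below, which samples corner points in the previous frame, tracks them with OpenCV's pyramidal Lucas-Kanade optical flow, and fits a 2D affine transform as the mapping; the function name, the parameter values and the choice of an affine model are assumptions of the example, not the patented implementation.

```python
import cv2
import numpy as np

def determine_mapping(prev_gray, cur_gray, wide_baseline_mask=None):
    # Sample a pixel point set in the previous frame, optionally restricted to
    # the wide-baseline region (large object distance relative to focal length).
    prev_pts = cv2.goodFeaturesToTrack(prev_gray, maxCorners=200,
                                       qualityLevel=0.01, minDistance=10,
                                       mask=wide_baseline_mask)
    # Track each sampled point into the current frame with an optical flow algorithm.
    cur_pts, status, _ = cv2.calcOpticalFlowPyrLK(prev_gray, cur_gray, prev_pts, None)
    ok = status.ravel() == 1
    prev_pts = prev_pts[ok].reshape(-1, 2)
    cur_pts = cur_pts[ok].reshape(-1, 2)
    # Determine the mapping by minimizing the fitting error between the known
    # coordinates and the tracked coordinates (here a 2x3 affine model).
    mapping, _inliers = cv2.estimateAffine2D(prev_pts, cur_pts)
    return mapping, prev_pts, cur_pts
```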
In one example, the relationship determining unit 120 may further include: and the relationship correction module is used for correcting the mapping relationship.
In one example, the relationship correction module may calculate, for each pixel in the set of pixels, a prediction error of the pixel according to the mapping relationship and the known position coordinates and the current position coordinates of the pixel; sequencing prediction errors of all pixel points; and re-determining the mapping relationship by re-optimizing the fitting error using the known position coordinates and the current position coordinates of each of the predetermined number of pixel points having smaller errors.
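A possible sketch of this correction step, assuming the mapping fitted above is a 2x3 affine matrix, is shown below; the function name refine_mapping and the keep_ratio parameter are illustrative only.

```python
import cv2
import numpy as np

def refine_mapping(mapping, prev_pts, cur_pts, keep_ratio=0.7):
    # Predict the current coordinates of every sampled point from its known
    # coordinates using the current mapping.
    ones = np.ones((prev_pts.shape[0], 1), dtype=np.float32)
    predicted = np.hstack([prev_pts, ones]) @ mapping.T          # (N, 2)
    errors = np.linalg.norm(predicted - cur_pts, axis=1)         # per-point prediction error
    # Sort the prediction errors and keep the predetermined number of points
    # with the smaller errors.
    keep = np.argsort(errors)[: max(3, int(keep_ratio * len(errors)))]
    # Re-determine the mapping by re-optimizing the fitting error on them.
    refined, _ = cv2.estimateAffine2D(prev_pts[keep], cur_pts[keep])
    return refined
```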
In one example, the relationship determining unit 120 may include: a first detection module for detecting a previous feature point set in the previous frame image and determining a position coordinate and feature information of each feature point in the previous feature point set; the second detection module is used for detecting a current feature point set in the current frame image and determining the position coordinates and feature information of each feature point in the current feature point set; a feature matching module for determining matching feature point pairs in the previous frame image and the current frame image according to the feature information; and a relation determination module for determining the mapping relation according to the previous position coordinates of the matching feature point pairs in the previous frame image and the current position coordinates in the current frame image.
In one example, the first detection module may detect the previous set of feature points in a wide baseline region of the previous frame image, the wide baseline region being a region where a ratio between an object distance and a focal distance is greater than or equal to a threshold.
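A hedged sketch of this feature-based variant is given below; ORB features and brute-force Hamming matching are one possible choice for the detection and matching modules and are not prescribed by the application.

```python
import cv2
import numpy as np

def determine_mapping_by_features(prev_gray, cur_gray, wide_baseline_mask=None):
    # Detect feature points and compute their feature information (descriptors)
    # in the previous frame (optionally within the wide-baseline region) and in
    # the current frame.
    orb = cv2.ORB_create(nfeatures=500)
    prev_kp, prev_desc = orb.detectAndCompute(prev_gray, wide_baseline_mask)
    cur_kp, cur_desc = orb.detectAndCompute(cur_gray, None)
    # Determine matching feature point pairs from the feature information.
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = matcher.match(prev_desc, cur_desc)
    prev_pts = np.float32([prev_kp[m.queryIdx].pt for m in matches])
    cur_pts = np.float32([cur_kp[m.trainIdx].pt for m in matches])
    # Determine the mapping from the matched previous/current coordinates.
    mapping, _inliers = cv2.estimateAffine2D(prev_pts, cur_pts)
    return mapping
```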
In another embodiment, the parameter estimation apparatus 100 may further include: and a coordinate correction unit 140, configured to perform weighted correction on the current position coordinates of the vanishing line in the current frame image.
In one example, the coordinate correcting unit 140 may include: a coordinate difference calculation module for calculating a position coordinate difference between the current position coordinates and the previous position coordinates of the vanishing line; and the coordinate correction module is used for correcting the current position coordinate according to the position coordinate difference.
In one example, the coordinate correction module may compare the position coordinate difference with a threshold; in response to the position coordinate difference being greater than or equal to the threshold, directly determine the estimated current position coordinate as the corrected current position coordinate; and in response to the position coordinate difference being less than the threshold, determine the corrected current position coordinate by a weight coefficient, the estimated current position coordinate, and a default position coordinate of the vanishing line.
In one example, the coordinate correction module may determine the weight coefficient according to the position coordinate difference, such that the larger the position coordinate difference, the larger the weight coefficient and the larger the proportion of the estimated current position coordinate in the corrected current position coordinate, and the smaller the position coordinate difference, the smaller that proportion; and may then determine the corrected current position coordinate by the weight coefficient, the estimated current position coordinate, and the default position coordinate of the vanishing line.
In yet another embodiment, the parameter estimation apparatus 100 may further include: a pitch change calculation unit for determining a change amount of a pitch angle of the imaging device with respect to a ground plane based on a previous position coordinate of the vanishing line in the previous frame image and a current position coordinate of the vanishing line in the current frame image.
In one example, the pitch change calculation unit may calculate a position coordinate difference between the current position coordinate and the previous position coordinate of the vanishing line; and calculating the variation of the pitch angle of the imaging device relative to the ground plane according to the position coordinate difference.
The specific functions and operations of the respective units and modules in the above-described parameter estimation apparatus 100 have been described in detail in the parameter estimation method described above with reference to fig. 1A to 7, and therefore, a repetitive description thereof will be omitted.
As described above, the embodiments of the present application can be applied to electronic devices such as vehicles, mobile robots, monitoring facilities, and the like, on which imaging devices are equipped.
Accordingly, the parameter estimation apparatus 100 according to the embodiment of the present application may be integrated into the electronic device as a software module and/or a hardware module; in other words, the electronic device may include the parameter estimation apparatus 100. For example, the parameter estimation apparatus 100 may be a software module in the operating system of the electronic device, or may be an application developed for the electronic device; of course, the parameter estimation apparatus 100 may equally be one of many hardware modules of the electronic device.
Alternatively, in another example, the parameter estimation apparatus 100 and the electronic device may be separate devices (e.g., a server), and the parameter estimation apparatus 100 may be connected to the electronic device through a wired and/or wireless network and transmit the interaction information according to an agreed data format.
Exemplary electronic device
Next, an electronic apparatus according to an embodiment of the present application is described with reference to fig. 9. The electronic device may be a computer or server or other device such as a vehicle, mobile robot, monitoring facility, etc., on which the imaging device is equipped.
FIG. 9 illustrates a block diagram of an electronic device in accordance with an embodiment of the present application.
As shown in fig. 9, the electronic device 10 includes one or more processors 11 and memory 12.
The processor 11 may be a Central Processing Unit (CPU) or other form of processing unit having data processing capabilities and/or instruction execution capabilities, and may control other components in the electronic device 10 to perform desired functions.
Memory 12 may include one or more computer program products that may include various forms of computer-readable storage media, such as volatile memory and/or non-volatile memory. The volatile memory may include, for example, Random Access Memory (RAM), cache memory (cache), and/or the like. The non-volatile memory may include, for example, Read Only Memory (ROM), hard disk, flash memory, etc. One or more computer program instructions may be stored on the computer-readable storage medium and executed by the processor 11 to implement the parameter estimation methods of the various embodiments of the present application described above and/or other desired functions. Information such as a previous frame image, a current frame image, position coordinates of pixel points in an image, a feature description, and the like may also be stored in the computer-readable storage medium.
In one example, the electronic device 10 may further include: an input device 13 and an output device 14, which are interconnected by a bus system and/or other form of connection mechanism (not shown). It should be noted that the components and configuration of electronic device 10 shown in FIG. 9 are exemplary only, and not limiting, and that electronic device 10 may have other components and configurations as desired.
For example, the input device 13 may be an imaging device for acquiring a sequence of image frames, which may be stored in the memory 12 for use by other components. Of course, other integrated or discrete imaging devices may be utilized to acquire the sequence of image frames and transmit it to the electronic device 10. The input device 13 may also include, for example, a keyboard, a mouse, and a communication network and a remote input device connected thereto.
The output device 14 may output various information to the outside (e.g., a user), including the current position coordinates of the estimated vanishing line in the current frame image, the estimation result such as the amount of change in the pitch angle of the imaging device with respect to the ground plane, and the like. The output devices 14 may include, for example, a display, speakers, a printer, and a communication network and its connected remote output devices, among others.
Of course, for simplicity, only some of the components of the electronic device 10 relevant to the present application are shown in fig. 9, and components such as buses, input/output interfaces, and the like are omitted. In addition, the electronic device 10 may include any other suitable components depending on the particular application.
Exemplary computer program product and computer-readable storage Medium
In addition to the above-described methods and apparatus, embodiments of the present application may also be a computer program product comprising computer program instructions that, when executed by a processor, cause the processor to perform the steps in the parameter estimation method according to various embodiments of the present application described in the "exemplary methods" section of this specification, supra.
The computer program product may be written with program code for performing the operations of embodiments of the present application in any combination of one or more programming languages, including an object-oriented programming language such as Java, C++ or the like and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computing device, partly on the user's device, as a stand-alone software package, partly on the user's computing device and partly on a remote computing device, or entirely on the remote computing device or server.
Furthermore, embodiments of the present application may also be a computer-readable storage medium having stored thereon computer program instructions that, when executed by a processor, cause the processor to perform the steps in the parameter estimation method according to various embodiments of the present application described above in the "Exemplary methods" section of this specification.
The computer-readable storage medium may take any combination of one or more readable media. The readable medium may be a readable signal medium or a readable storage medium. A readable storage medium may include, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or a combination of any of the foregoing. More specific examples (a non-exhaustive list) of the readable storage medium include: an electrical connection having one or more wires, a portable disk, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
The foregoing describes the general principles of the present application in conjunction with specific embodiments, however, it is noted that the advantages, effects, etc. mentioned in the present application are merely examples and are not limiting, and they should not be considered essential to the various embodiments of the present application. Furthermore, the foregoing disclosure of specific details is for the purpose of illustration and description and is not intended to be limiting, since the foregoing disclosure is not intended to be exhaustive or to limit the disclosure to the precise details disclosed.
The block diagrams of devices, apparatuses, and systems referred to in this application are only given as illustrative examples and are not intended to require or imply that the connections, arrangements, and configurations must be made in the manner shown in the block diagrams. These devices, apparatuses, and systems may be connected, arranged, and configured in any manner, as will be appreciated by those skilled in the art. Words such as "including," "comprising," "having," and the like are open-ended words that mean "including, but not limited to," and are used interchangeably therewith. The words "or" and "and" as used herein mean, and are used interchangeably with, the word "and/or," unless the context clearly dictates otherwise. The word "such as" is used herein to mean, and is used interchangeably with, the phrase "such as but not limited to".
It should also be noted that in the devices, apparatuses, and methods of the present application, the components or steps may be decomposed and/or recombined. These decompositions and/or recombinations are to be considered as equivalents of the present application.
The previous description of the disclosed aspects is provided to enable any person skilled in the art to make or use the present application. Various modifications to these aspects will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other aspects without departing from the scope of the application. Thus, the present application is not intended to be limited to the aspects shown herein but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.
The foregoing description has been presented for purposes of illustration and description. Furthermore, the description is not intended to limit embodiments of the application to the form disclosed herein. While a number of example aspects and embodiments have been discussed above, those of skill in the art will recognize certain variations, modifications, alterations, additions and sub-combinations thereof.

Claims (13)

1. A method of parameter estimation, comprising:
acquiring a current frame image and a previous frame image acquired by an imaging device;
determining a mapping relationship between the current frame image and the previous frame image;
estimating the current position coordinate of the vanishing line in the current frame image according to the mapping relation and the previous position coordinate of the vanishing line in the previous frame image;
calculating a position coordinate difference between the current position coordinates and the previous position coordinates of the vanishing line;
comparing the position coordinate difference to a threshold;
in response to the position coordinate difference being greater than or equal to the threshold, directly determining the estimated current position coordinate as a modified current position coordinate; and
in response to the position coordinate difference being less than the threshold, determining a modified current position coordinate by a weighting factor, the estimated current position coordinate, and a default position coordinate of the vanishing line.
2. The method of claim 1, wherein determining the mapping relationship between the current frame image and the previous frame image comprises:
sampling the previous frame image to obtain a pixel point set, wherein the position coordinate of each pixel point in the pixel point set in the previous frame image is a known position coordinate;
determining the current position coordinates of each pixel point in the pixel point set in the current frame image; and
and determining the mapping relation according to the known position coordinate and the current position coordinate of each pixel point.
3. The method of claim 2, wherein sampling the previous frame image comprises:
the previous frame image is sampled in a wide baseline region, which is a region where the ratio of object distance to focal distance is greater than or equal to a threshold.
4. The method of claim 2, wherein determining the current position coordinates of each pixel in the set of pixels in the current frame image comprises:
tracking each pixel point in the pixel point set in the current frame image based on an optical flow algorithm; and
and determining the current position coordinates of each pixel point in the current frame image according to the tracking result.
5. The method of claim 2, wherein determining the mapping relationship based on the known location coordinates and the current location coordinates of each pixel point comprises:
the mapping relationship is determined by optimizing the fitting error using the known position coordinates and the current position coordinates of each pixel point.
6. The method of claim 2, wherein determining the mapping relationship between the current frame image and the previous frame image further comprises:
aiming at each pixel point in the pixel point set, calculating the prediction error of the pixel point according to the mapping relation, the known position coordinate and the current position coordinate of the pixel point;
sequencing prediction errors of all pixel points; and
and re-determining the mapping relation by re-optimizing the fitting error by using the known position coordinates and the current position coordinates of each pixel point in the preset number of pixel points with smaller errors.
7. The method of claim 1, wherein determining the mapping relationship between the current frame image and the previous frame image comprises:
detecting a previous feature point set in the previous frame image and determining position coordinates and feature information of each feature point in the previous feature point set;
detecting a current feature point set in the current frame image and determining position coordinates and feature information of each feature point in the current feature point set;
determining matching pairs of feature points in the previous frame image and the current frame image according to the feature information; and
determining the mapping relation according to the previous position coordinates of the matching feature point pairs in the previous frame image and the current position coordinates in the current frame image.
8. The method of claim 7, wherein detecting a previous set of feature points in the previous frame image comprises:
the previous feature point set is detected in a wide baseline region of the previous frame image, the wide baseline region being a region where a ratio between an object distance and a focal distance is greater than or equal to a threshold.
9. The method of claim 1, wherein determining the modified current position coordinate by the weighting factor, the estimated current position coordinate, and the default position coordinate of the vanishing line comprises:
determining the weighting factor according to the position coordinate difference, wherein the larger the position coordinate difference, the larger the proportion of the estimated current position coordinate in the modified current position coordinate, and the smaller the position coordinate difference, the smaller that proportion; and
determining the modified current position coordinate by the weighting factor, the estimated current position coordinate, and the default position coordinate of the vanishing line.
10. The method of claim 1, further comprising:
and calculating the variation of the pitch angle of the imaging device relative to the ground plane according to the position coordinate difference.
11. A parameter estimation apparatus, comprising:
the image acquisition unit is used for acquiring a current frame image and a previous frame image which are acquired by the imaging device;
a relationship determination unit for determining a mapping relationship between the current frame image and the previous frame image;
a coordinate estimation unit for estimating a current position coordinate of a vanishing line in the current frame image according to the mapping relation and a previous position coordinate of the vanishing line in the previous frame image; and
a coordinate correction unit comprising:
a coordinate difference calculation module for calculating a position coordinate difference between the current position coordinates and the previous position coordinates of the vanishing line;
the coordinate correction module is used for comparing the position coordinate difference with a threshold value; in response to the position coordinate difference being greater than or equal to the threshold, directly determining the estimated current position coordinate as a modified current position coordinate; and determining a modified current position coordinate by a weight coefficient, the estimated current position coordinate, and a default position coordinate of the vanishing line in response to the position coordinate difference being less than the threshold.
12. An electronic device, comprising:
a processor;
a memory; and
computer program instructions stored in the memory, which, when executed by the processor, cause the processor to perform the method of any of claims 1-10.
13. A computer-readable storage medium having stored thereon computer program instructions which, when executed by a processor, cause the processor to perform the method of any one of claims 1-10.
CN201610824392.2A 2016-09-14 2016-09-14 Parameter estimation method and device and electronic equipment Active CN106447730B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201610824392.2A CN106447730B (en) 2016-09-14 2016-09-14 Parameter estimation method and device and electronic equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201610824392.2A CN106447730B (en) 2016-09-14 2016-09-14 Parameter estimation method and device and electronic equipment

Publications (2)

Publication Number Publication Date
CN106447730A CN106447730A (en) 2017-02-22
CN106447730B true CN106447730B (en) 2020-02-28

Family

ID=58168912

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201610824392.2A Active CN106447730B (en) 2016-09-14 2016-09-14 Parameter estimation method and device and electronic equipment

Country Status (1)

Country Link
CN (1) CN106447730B (en)

Families Citing this family (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107607205A (en) * 2017-09-30 2018-01-19 江苏西格数据科技有限公司 Wire harness color sequences detecting system and method
CN109145860B (en) * 2018-09-04 2019-12-13 百度在线网络技术(北京)有限公司 lane line tracking method and device
CN109410284A (en) * 2018-10-31 2019-03-01 百度在线网络技术(北京)有限公司 A kind of method for parameter estimation, device, electronic equipment, vehicle and storage medium
CN109544629B (en) * 2018-11-29 2021-03-23 南京人工智能高等研究院有限公司 Camera position and posture determining method and device and electronic equipment
CN110944212A (en) * 2019-11-29 2020-03-31 合肥图鸭信息科技有限公司 Video frame reconstruction method and device and terminal equipment
CN113119978A (en) * 2021-05-10 2021-07-16 蔚来汽车科技(安徽)有限公司 Lane edge extraction method and apparatus, automatic driving system, vehicle, and storage medium
CN113942522A (en) * 2021-05-31 2022-01-18 重庆工程职业技术学院 Intelligent driving safety protection system
CN113449659B (en) * 2021-07-05 2024-04-23 淮阴工学院 Lane line detection method
CN113903014B (en) * 2021-12-07 2022-05-17 智道网联科技(北京)有限公司 Lane line prediction method and device, electronic device and computer-readable storage medium


Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102456226A (en) * 2010-10-22 2012-05-16 财团法人工业技术研究院 Region-of-interest tracking system, method and computer program product
CN103841297A (en) * 2012-11-23 2014-06-04 中国航天科工集团第三研究院第八三五七研究所 Electronic image-stabilizing method suitable for resultant-motion camera shooting carrier
CN103455983A (en) * 2013-08-30 2013-12-18 深圳市川大智胜科技发展有限公司 Image disturbance eliminating method in embedded type video system
CN104408460A (en) * 2014-09-17 2015-03-11 电子科技大学 A lane line detecting and tracking and detecting method
CN104463859A (en) * 2014-11-28 2015-03-25 中国航天时代电子公司 Real-time video stitching method based on specified tracking points

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
SURF-based target tracking algorithm for infrared imaging terminal guidance; Xin Ming, Zhang Miaohui; Journal of Optoelectronics·Laser; 2012-08-15; Vol. 23, No. 8; Section 4, p. 1600 *
An iterative algorithm for corridor vanishing point detection and tracking; Zhang Qiang, Wang Hengsheng; Application Research of Computers; 2013-09-10; Vol. 31, No. 3; pp. 735-738 *

Also Published As

Publication number Publication date
CN106447730A (en) 2017-02-22

Similar Documents

Publication Publication Date Title
CN106447730B (en) Parameter estimation method and device and electronic equipment
CN106485233B (en) Method and device for detecting travelable area and electronic equipment
US10169667B2 (en) External environment recognizing device for vehicle and vehicle behavior control device
CN107567412B (en) Object position measurement using vehicle motion data with automotive camera
US9547795B2 (en) Image processing method for detecting objects using relative motion
JP5689907B2 (en) Method for improving the detection of a moving object in a vehicle
JP4052650B2 (en) Obstacle detection device, method and program
JP2020064046A (en) Vehicle position determining method and vehicle position determining device
US10762656B2 (en) Information processing device, imaging device, apparatus control system, information processing method, and computer program product
CN107121132B (en) Method and device for obtaining vehicle environment image and method for identifying object in environment
WO2016178335A1 (en) Lane detection apparatus and lane detection method
JP6708730B2 (en) Mobile
US9098750B2 (en) Gradient estimation apparatus, gradient estimation method, and gradient estimation program
EP3403216A1 (en) Systems and methods for augmenting upright object detection
EP3282389B1 (en) Image processing apparatus, image capturing apparatus, moving body apparatus control system, image processing method, and program
JP6032034B2 (en) Object detection device
US11151729B2 (en) Mobile entity position estimation device and position estimation method
US10235579B2 (en) Vanishing point correction apparatus and method
JP5539250B2 (en) Approaching object detection device and approaching object detection method
JP4502733B2 (en) Obstacle measuring method and obstacle measuring device
WO2010044127A1 (en) Device for detecting height of obstacle outside vehicle
US10643077B2 (en) Image processing device, imaging device, equipment control system, equipment, image processing method, and recording medium storing program
KR100751096B1 (en) Velocity measuring apparatus and method using optical flow
KR101912085B1 (en) Reliability calaulation method of lane detection and Apparatus for calcaulating for performing thereof
JP2010223619A (en) Calibration apparatus and calibration method

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant