CN114119760A - Motor vehicle positioning method and device, electronic device and storage medium - Google Patents
- Publication number
- CN114119760A (application number CN202210107324.XA)
- Authority
- CN
- China
- Prior art keywords
- camera
- positioning aid
- positioning
- distance
- motor vehicle
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
- G06T7/73 — Determining position or orientation of objects or cameras using feature-based methods
- G06T7/246 — Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
- G06T7/277 — Analysis of motion involving stochastic approaches, e.g. using Kalman filters
- G06T7/80 — Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
- G06T2207/30241 — Trajectory
- G06T2207/30244 — Camera pose
- G06T2207/30248 — Vehicle exterior or interior
- G06T2207/30252 — Vehicle exterior; Vicinity of vehicle
Landscapes
- Engineering & Computer Science (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Multimedia (AREA)
- Navigation (AREA)
- Length Measuring Devices By Optical Means (AREA)
Abstract
The invention relates to a motor vehicle positioning method and device, an electronic device and a storage medium. The motor vehicle positioning method is used for a motor vehicle equipped with a camera, and comprises the following steps: determining a corresponding positioning aid in a map according to a plurality of estimated current positions of the motor vehicle, the positioning aid having a certain length; obtaining a photograph taken at the current location, the photograph containing the positioning aid; projecting the positioning aid onto the two-dimensional imaging plane of the camera of the vehicle at each estimated position; calculating the matching degree between the positioning aid projected on the two-dimensional imaging plane of the camera at each estimated position and the positioning aid in the photograph taken by the real camera; and setting the estimated current position with the best matching degree as the current position.
Description
Technical Field
The invention relates to motor vehicle positioning.
Background
It is difficult to perform high-precision positioning of a vehicle in an environment without a global positioning signal (e.g., an underground parking lot).
Disclosure of Invention
The present invention has been made in view of the above problems in the prior art and is intended to solve one or more of them.
According to an aspect of the present invention, there is provided a motor vehicle positioning method for a motor vehicle equipped with a camera, comprising the steps of: determining a corresponding positioning aid in a map according to a plurality of estimated current positions of the motor vehicle, the positioning aid having a certain length; obtaining a photograph taken at the current location, the photograph containing the positioning aid; projecting the positioning aid onto the two-dimensional imaging plane of the camera of the vehicle at each estimated position; calculating the matching degree between the positioning aid projected on the two-dimensional imaging plane of the camera at each estimated position and the positioning aid in the photograph taken by the real camera; and setting the estimated current position with the best matching degree as the current position.
According to another aspect of the present invention, there is provided a positioning apparatus for a motor vehicle equipped with a camera, comprising: a positioning aid determining unit for determining a corresponding positioning aid in a map according to a plurality of estimated current positions of the motor vehicle, wherein the positioning aid has a certain length; a photo obtaining unit that obtains a photo taken at the current position, the photo containing the positioning aid; a projection unit that projects the positioning aid onto the two-dimensional imaging plane of the camera of the vehicle at each estimated position; a matching degree calculation unit that calculates the matching degree between the positioning aid projected on the two-dimensional imaging plane of the camera at each estimated position and the positioning aid in the photo taken by the real camera; and a current position setting unit that sets the estimated current position with the best matching degree as the current position.
According to one embodiment, the positioning aid determination unit determines the corresponding positioning aid in the map for each estimated current position, or clusters the plurality of estimated current positions, and determines the corresponding positioning aid in the map according to the center position of the cluster.
According to one embodiment, the positioning aid determination unit sets a convex-polygon search box, determines the feature points falling within the search box from the camera position of the estimated current position using cross multiplication, and determines the corresponding positioning aid from the feature points falling within the search box, wherein the cross multiplication is as follows: connect each vertex of the convex polygon to the feature point to form a feature point vector; then, in the clockwise or counterclockwise direction, cross-multiply the edge at each vertex along the predetermined direction with the feature point vector corresponding to that vertex; the feature point is determined to be inside the search box when all cross-product results have the same sign.
According to one embodiment, the projection unit projects the positioning aid onto the two-dimensional imaging plane of the camera of the vehicle at each estimated position as follows: determining the coordinates and direction of the camera in the map space from the coordinates of the vehicle and the coordinates and installation angle of the camera relative to the vehicle, using a camera rotation matrix that represents the installation angle of the camera relative to the vehicle; determining the vector from the positioning aid to the camera according to the coordinates of the positioning aid in the map; multiplying the camera rotation matrix, expressed in the map coordinate system, by that vector; and projecting the resulting vector onto the imaging plane of the camera.
According to one embodiment, the number of the cameras is one, the number of the positioning aids is one, the matching degree calculation unit takes a predetermined number of points on the positioning aids projected on the two-dimensional plane of each estimated current position, calculates a distance from each point to the positioning aid photographed by the real camera, and determines the matching degree based on the distance; or
The matching degree calculation unit takes a predetermined number of points on each positioning aid, calculates the distance from each point to the positioning aid shot by the real camera, takes the shortest distance in the distances as the distance of each positioning aid, and then determines the matching degree according to the distance; or
The matching degree calculation unit takes a predetermined number of points on the positioning aid for each camera, calculates the distance from each point to the positioning aid photographed by each camera, takes the shortest distance among the distances as the positioning aid distance of each camera, thereby obtaining the positioning aid distances of n cameras, and then determines the matching degree according to the n positioning aid distances, wherein n is an integer greater than 1.
According to one embodiment, when the distance from each point to the positioning aid photographed by the real camera is calculated, the perpendicular distance is used whenever it can be computed; when it cannot be computed (for example, when the foot of the perpendicular falls outside the photographed segment), the distance from the end point of the positioning aid to the point is used as the distance from the point to the positioning aid photographed by the real camera.
According to one embodiment, the apparatus further comprises an estimated current position acquisition unit that sets a starting point position whose position range is known; randomly generates a plurality of virtual positions within a certain range around the starting point position; estimates, as the motor vehicle moves, the current positions of the starting point position and the virtual positions using an extended Kalman filter algorithm; and sets the estimated current positions of the starting point position and the plurality of virtual positions as the estimated current positions.
According to an aspect of the present invention, there is provided an electronic apparatus including: a processor; a memory for storing the processor-executable instructions; wherein the processor is configured to execute the instructions to implement the method of the present invention.
According to an aspect of the present invention, there is provided a computer-readable storage medium on which a device control program is stored, which when executed by a processor implements the method of the present invention.
According to some embodiments of the invention, the positioning of the motor vehicle in the indoor environment can be simply, quickly and accurately completed.
Drawings
The invention may be better understood with reference to the following drawings. The drawings are only schematic and do not limit the scope of the invention.
Fig. 1 is a schematic flow chart illustrating a method for locating a motor vehicle according to an embodiment of the present invention.
FIG. 2 is a flow chart illustrating a method of obtaining an estimated current position according to one embodiment of the present invention.
FIG. 3 is a schematic diagram illustrating cross multiplication according to one embodiment of the invention.
Fig. 4 is a schematic block diagram illustrating a motor vehicle positioning device according to an embodiment of the present invention.
Detailed Description
The following describes embodiments of the present invention with reference to the drawings. These descriptions are exemplary, intended to enable those skilled in the art to practice embodiments of the invention, and do not limit the scope of the invention. Details that would be needed in an actual implementation but are not relevant to understanding the invention are omitted from the description.
Fig. 1 is a schematic flow chart illustrating a method for locating a motor vehicle according to an embodiment of the present invention. The motor vehicle according to the method of the invention has a camera that can capture images of the surroundings. A positioning aid is present in the surrounding environment and has a certain length. The positioning aid should be an object that is easily captured by the camera and easily recognized by image recognition. In one embodiment, the positioning aid is, for example, a light tube.
As shown in fig. 1, according to a positioning method for a motor vehicle according to an embodiment of the present invention, first, in step S100, corresponding positioning aids are respectively determined in a map (e.g., a three-dimensional map) according to a plurality of estimated current positions of the vehicle. For convenience, the following description uses a light tube as the example positioning aid.
The estimated current positions may be a known initial position or positions determined by the method of the present invention in the previous cycle. They may be obtained from a dedicated estimated-current-position unit or from an external input.
At start-up, the exact location of the vehicle is not known, but its approximate location is. In common practice, the parking position of the vehicle at start-up (e.g., the parking space) is known, as are the coordinates of this position in the map coordinate system. However, some error arises because the driver parks in this position imprecisely: the position error is, for example, approximately 1-2 meters, and the heading error approximately 5 degrees. During movement, an EKF (extended Kalman filter) algorithm is used in combination with the movement information of the vehicle (ABS signals and inertial navigation) for the calculation. The estimated location also has a certain error, and the error range can be derived from the variance and covariance of the EKF. According to one embodiment, an error ellipse can be computed from the variances and covariance in the x and y directions. Since there are a plurality of estimated current positions, each within the error range, these estimated current positions cover the true current position.
FIG. 2 is a flow chart illustrating a method of obtaining an estimated current position according to one embodiment of the present invention. As shown in fig. 2, according to an embodiment of the present invention, a starting point (point A) is first set manually, and the position range of the starting point is known. The starting point location includes the position and heading of the vehicle. As described above, the starting point is, for example, the parking space position when the vehicle is started; only the position range of the starting point is known, not its precise position. However, the invention does not preclude knowing its exact location.
Virtual positions are then randomly generated within a certain range around the starting point; each virtual position includes a position and heading of the vehicle. Those skilled in the art will readily appreciate that, although virtual, one of these positions may be closest to the true position, because the assumed starting position is not accurate. The vehicle positions (the starting point position and the virtual positions) may collectively be referred to as the A-point positions. For example, where the position error at the initial position is 1-2 meters and the heading error is 5 degrees, the point cloud of virtual positions may span 2-2.5 meters around the estimated current position, with a heading spread of 10 degrees. At a non-initial position, the point cloud of virtual positions spans 2 times the range of the error ellipse.
Then, as shown in fig. 2, after a certain time interval the vehicle reaches point B. The vehicle positions (the A-point positions) are propagated to the positions at point B (the B-point positions) using the EKF algorithm or another algorithm, and these positions are set as the estimated current positions.
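The random scattering of virtual positions around an imprecise starting pose can be sketched as follows. This is an illustrative Python sketch, not the patent's implementation: the function name, the uniform noise model, and the default ranges (2 m and 10 degrees, echoing the example figures above) are assumptions, since the patent does not prescribe a distribution.

```python
import numpy as np

rng = np.random.default_rng(0)

def generate_virtual_positions(start_pose, n=100,
                               pos_range=2.0, heading_range_deg=10.0):
    """Scatter n candidate poses (x, y, heading) around an imprecise start pose.

    pos_range and heading_range_deg are illustrative bounds; the patent only
    states that virtual positions are generated "within a certain range".
    """
    x, y, heading = start_pose
    poses = np.empty((n, 3))
    poses[:, 0] = x + rng.uniform(-pos_range, pos_range, n)        # x offset
    poses[:, 1] = y + rng.uniform(-pos_range, pos_range, n)        # y offset
    poses[:, 2] = heading + rng.uniform(                           # heading offset
        -np.radians(heading_range_deg), np.radians(heading_range_deg), n)
    return poses
```

Each row is one A-point candidate; during movement each candidate would be propagated independently by the EKF.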
According to one embodiment, a convex-polygon search box is set for each estimated position, feature points falling into the search box are determined from the estimated current position of the vehicle in the map (the B-point position) by means of cross multiplication, and the corresponding lamps are determined in the map from the feature points falling into the search box.
FIG. 3 is a schematic diagram illustrating cross multiplication according to one embodiment of the invention. In the example of fig. 3, a trapezoid is taken as the convex polygon. As shown in fig. 3, to determine whether a feature point in the map is inside the search box, each vertex of the convex polygon is connected to the feature point, forming a feature point vector; in the example in the figure, four feature point vectors are obtained. Then the edge at each vertex is cross-multiplied with the feature point vector corresponding to that vertex, along a predetermined direction. Thus, when the convex polygon is a trapezoid, four cross multiplications are performed, and the feature point is determined to be inside the search box when all cross-product results have the same sign. When the predetermined direction is clockwise, clockwise cross multiplication is performed, and the feature point is inside the search box when all cross-product results are negative; fig. 3 shows this clockwise case. When the predetermined direction is counterclockwise, counterclockwise cross multiplication is performed, and the feature point is inside the search box when all cross-product results are positive.
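The cross-multiplication test above can be sketched in Python. The names `cross2d` and `point_in_convex_polygon` are illustrative; accepting either all-non-negative or all-non-positive cross products covers both the clockwise and counterclockwise winding cases described in the text.

```python
def cross2d(o, a, b):
    """z-component of the cross product (a - o) x (b - o)."""
    return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])

def point_in_convex_polygon(point, vertices):
    """Point-in-convex-polygon test by the same-sign cross-product rule.

    vertices must be listed consistently clockwise or counterclockwise.
    The point is inside (or on the boundary) exactly when every
    edge x vertex-to-point cross product has the same sign.
    """
    n = len(vertices)
    signs = []
    for i in range(n):
        v0, v1 = vertices[i], vertices[(i + 1) % n]   # edge at vertex v0
        signs.append(cross2d(v0, v1, point))          # edge x feature-point vector
    return all(s >= 0 for s in signs) or all(s <= 0 for s in signs)
```

For the trapezoid of fig. 3 this performs exactly four cross multiplications per feature point.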
According to one embodiment, the search box is always in front of a camera mounted on the vehicle. According to this embodiment, unnecessary calculations can be reduced, only lamps falling within the search box are considered, and the height of the 3D lamp does not need to be considered.
According to one embodiment, feature points that are shorter than a predetermined distance from the camera are removed when determining the corresponding lamp. In one embodiment, the predetermined distance is 1-2 meters.
By using the search box and determining which points are in the search box according to the cross multiplication, the lamp tube can be found quickly and accurately, and the algorithm difficulty is reduced.
According to one embodiment, the plurality of estimated current positions are clustered, the camera position of the motor vehicle at the cluster center is determined, and the corresponding positioning aid is determined in the map. In determining the corresponding positioning aid, a search box and cross multiplication may be used as described above. This method further reduces the computation load.
Then, in step S200, a photo taken at the current position is acquired, the photo containing the positioning aid. It will be understood by those skilled in the art that the photo may be obtained by taking a picture with the camera or by receiving it from the camera. The positioning aid may be identified by various feature-recognition methods, now known or developed in the future. If the positioning aid is not found in the photo, a new photo may be taken, or a photo may be taken only after the positioning aid is determined to be present. On the other hand, since the general orientation of the positioning aid is known, the photograph can be made to include the positioning aid based on its position or orientation information.
Then, in step S300, the lamps are projected onto the two-dimensional imaging planes of the cameras of the vehicles at the respective estimated current positions, respectively. According to one embodiment, the light tubes in the map are projected onto the two-dimensional imaging plane of the camera of each virtual location vehicle as follows:
First, the coordinates and direction of the camera in the map space are determined from the coordinates of the vehicle and the coordinates and installation angle of the camera relative to the vehicle, using a camera rotation matrix that represents the installation angle of the camera relative to the vehicle. Then, the lamp-to-camera vector is determined from the coordinates of the lamp in the map. Next, the camera rotation matrix, expressed in the map coordinate system, is multiplied by that vector. Finally, the resulting vector is projected onto the imaging plane of the camera.
Since the lamp has a certain length, the projection is a line segment.
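The projection steps above can be sketched with a minimal pinhole-camera model in Python. This is an assumption-laden sketch, not the patent's implementation: the rotation convention (R maps map-frame vectors into the camera frame) and the intrinsics fx, fy, cx, cy are not spelled out in the patent. Projecting both endpoints of a lamp and connecting them yields the projected line segment.

```python
import numpy as np

def project_lamp_endpoint(p_map, cam_pos_map, R, fx, fy, cx, cy):
    """Project one 3D lamp endpoint (map coordinates) onto the image plane.

    R is assumed to rotate map-frame vectors into the camera frame (built
    from the vehicle heading and the camera installation angle); fx, fy,
    cx, cy are assumed pinhole intrinsics. Returns pixel (u, v), or None
    when the point lies behind the camera.
    """
    vec = np.asarray(p_map, dtype=float) - np.asarray(cam_pos_map, dtype=float)
    pc = R @ vec                       # lamp-to-camera vector in camera frame
    if pc[2] <= 0:                     # behind the imaging plane: not visible
        return None
    return (fx * pc[0] / pc[2] + cx,   # pinhole projection to pixels
            fy * pc[1] / pc[2] + cy)
```

Calling this once per lamp endpoint, for the camera pose of each estimated current position, produces the projected segments to be matched in step S400.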
Next, in step S400, the matching degree of the lamp projected on the two-dimensional plane of each virtual position with the lamp photographed by the camera of the real vehicle is calculated.
The photographed lamp is imaged on the imaging plane of the camera, so it can be matched against the lamp projected onto the two-dimensional plane of each virtual position.
According to one embodiment, a predetermined number of points are taken on the lamp projected on the two-dimensional plane at each estimated current position, and the distance from each point to the lamp photographed by the camera is calculated. When the perpendicular distance can be calculated, it is taken as the distance from the point to the photographed lamp. When the perpendicular distance cannot be calculated, for example when the foot of the perpendicular falls outside the photographed lamp segment, the distance from the end point of the lamp to the point is taken instead. The average distance can then be taken; the shorter the average distance, the higher the matching degree.
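The perpendicular-versus-endpoint rule is the standard point-to-segment distance, sketched below in Python for 2D image coordinates (`point_to_segment_distance` is an illustrative name).

```python
import numpy as np

def point_to_segment_distance(p, a, b):
    """Distance from point p to the segment from a to b.

    Uses the perpendicular distance when the foot of the perpendicular
    lies on the segment, and otherwise the distance to the nearer end
    point, matching the rule described above.
    """
    p, a, b = (np.asarray(v, dtype=float) for v in (p, a, b))
    ab = b - a
    t = np.dot(p - a, ab) / np.dot(ab, ab)   # parameter of perpendicular foot
    if t < 0.0:
        return float(np.linalg.norm(p - a))  # foot falls before end point a
    if t > 1.0:
        return float(np.linalg.norm(p - b))  # foot falls past end point b
    return float(np.linalg.norm(p - (a + t * ab)))  # perpendicular distance
```

Averaging this distance over the sampled points of a projected lamp gives the matching measure of the embodiment above.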
According to one embodiment, points are assigned according to the distance; for example, a distance of less than 4 pixels is awarded 4 points, between 4 and 8 pixels 2 points, between 8 and 16 pixels 1 point, and so on. If 5 points are sampled, the higher the total score over these points, the better the match.
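The example scoring can be sketched as follows, using the pixel thresholds and point values quoted above; the exact handling of distances at or beyond 16 pixels (scored 0 here) is an assumption.

```python
def score_distance(d_px):
    """Map a sampled point's distance (in pixels) to a match score.

    Thresholds 4/8/16 px and scores 4/2/1 follow the example values in
    the text; distances of 16 px or more score 0 (an assumption).
    """
    if d_px < 4:
        return 4
    if d_px < 8:
        return 2
    if d_px < 16:
        return 1
    return 0

def match_score(distances):
    """Total score over the sampled points; higher means a better match."""
    return sum(score_distance(d) for d in distances)
```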
According to one embodiment, the number of the positioning aids is k (k is a positive integer greater than 1), a predetermined number of points are taken on each positioning aid, the distance from each point to the positioning aid photographed by the real camera is calculated, the shortest distance among the distances is taken as the distance of each positioning aid, so that 1 positioning aid obtains one distance, k positioning aids obtain k distances, and then the matching degree is determined according to the distances. According to one embodiment, the sum of these k distances is taken, and the smaller the sum, the higher the degree of matching.
According to one embodiment, n cameras may be provided; for each camera, a predetermined number of points are taken on the positioning aid, the distance from each point to the positioning aid photographed by that camera is calculated, and the shortest of these distances is taken as that camera's positioning aid distance, thereby obtaining the positioning aid distances of the n cameras; the matching degree is then determined from the n positioning aid distances, where n is an integer greater than 1. According to one embodiment, the sum of these n distances is taken, and the smaller the sum, the higher the matching degree.
It will be understood by those skilled in the art that the above can be combined for processing in the case of multiple cameras and multiple lamps (multiple positioning aids in one picture). For example, for a motor vehicle whose current position is estimated, the sum of all these shortest distances may be taken, and the degree of matching may be determined based on the magnitude of the sum.
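The combination rule can be sketched as follows: every lamp/camera pairing contributes its shortest sampled-point distance, and the estimated position with the smallest sum is the best match. The data layout and function name are illustrative assumptions.

```python
def best_candidate(candidates):
    """Pick the best-matching estimated current position.

    candidates: list of (pose, distances_per_feature), where
    distances_per_feature holds, for each lamp/camera pairing, the list
    of sampled point-to-segment distances. The pose whose summed
    shortest distances is smallest wins (illustrative sketch).
    """
    def total(distances_per_feature):
        return sum(min(d) for d in distances_per_feature)
    return min(candidates, key=lambda c: total(c[1]))[0]
```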
Finally, in step S500, the estimated current position with the best matching degree is set as the current position and can be used for trajectory generation.
Further, the estimated current position with the best matching degree may be used as the starting position, and virtual positions may be generated randomly within a certain range around it, as described with reference to fig. 2, so that these virtual positions can serve as the estimated current positions for position determination and trajectory generation at the next time. According to one embodiment, estimated current positions whose matching degree is poor (lower than a predetermined matching degree) may be removed; the positions of the remaining estimated current positions at the next camera-matching time can then be deduced from the motion trajectory of the vehicle. According to this embodiment, the number of virtual positions to be generated can be reduced and the calculation speed increased. In the removal, for example, all estimated current positions with a matching degree lower than the average matching degree are removed.
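The below-average removal rule can be sketched as a small helper (illustrative; assumes a higher score means a better match):

```python
def prune_estimates(estimates, scores):
    """Keep only estimated current positions scoring at least the average.

    Implements the example removal rule above: positions with a matching
    score below the mean are dropped before the next cycle.
    """
    mean_score = sum(scores) / len(scores)
    return [e for e, s in zip(estimates, scores) if s >= mean_score]
```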
Fig. 4 is a schematic block diagram illustrating a motor vehicle positioning device according to an embodiment of the present invention. As shown in fig. 4, a motor vehicle positioning apparatus according to an embodiment of the present invention is used for a motor vehicle (e.g., an automobile) equipped with a camera. The motor vehicle positioning device 10 includes: an estimated current position acquisition unit 100 for acquiring a plurality of estimated current positions; a positioning aid determining unit 200 for determining a corresponding positioning aid in a map according to the plurality of estimated current positions of the motor vehicle, the positioning aid having a certain length; a photo obtaining unit 300 that obtains a photo taken at the current position, the photo containing the positioning aid; a projection unit 400 that projects the positioning aid onto the two-dimensional imaging plane of the camera of the vehicle at each estimated position; a matching degree calculation unit 500 that calculates the matching degree between the positioning aid projected on the two-dimensional imaging plane of the camera at each estimated position and the positioning aid in the photo taken by the real camera; and a current position setting unit 600 that sets the estimated current position with the best matching degree as the current position.
According to one embodiment, the positioning aid determination unit 200 determines a corresponding positioning aid in the map for each estimated current position, or clusters the plurality of estimated current positions, and determines a corresponding positioning aid in the map according to the center position of the cluster.
According to one embodiment, the positioning aid determination unit 200 sets a convex-polygon search box, determines the feature points falling within the search box from the camera position of the estimated current position using cross multiplication, and determines the corresponding positioning aid from the feature points falling within the search box, wherein the cross multiplication is as follows: connect each vertex of the convex polygon to the feature point to form a feature point vector; then, in the clockwise or counterclockwise direction, cross-multiply the edge at each vertex along the predetermined direction with the feature point vector corresponding to that vertex; the feature point is determined to be inside the search box when all cross-product results have the same sign.
According to one embodiment, the projection unit 400 projects the positioning aid onto the two-dimensional imaging plane of the camera of the vehicle at each estimated position as follows: determining the coordinates and direction of the camera in the map space from the coordinates of the vehicle and the coordinates and installation angle of the camera relative to the vehicle, using a camera rotation matrix that represents the installation angle of the camera relative to the vehicle; determining the vector from the positioning aid to the camera according to the coordinates of the positioning aid in the map; multiplying the camera rotation matrix, expressed in the map coordinate system, by that vector; and projecting the resulting vector onto the imaging plane of the camera.
According to one embodiment, the number of the cameras is one, the number of the positioning aids is one, the matching degree calculation unit takes a predetermined number of points on the positioning aids projected on the two-dimensional plane of each estimated current position, calculates a distance from each point to the positioning aid photographed by the real camera, and determines the matching degree based on the distance; or
The matching degree calculation unit takes a predetermined number of points on each positioning aid, calculates the distance from each point to the positioning aid shot by the real camera, takes the shortest distance in the distances as the distance of each positioning aid, and then determines the matching degree according to the distance; or
The matching degree calculation unit takes a predetermined number of points on the positioning aid for each camera, calculates the distance from each point to the positioning aid photographed by each camera, takes the shortest distance among the distances as the positioning aid distance of each camera, thereby obtaining the positioning aid distances of n cameras, and then determines the matching degree according to the n positioning aid distances, wherein n is an integer greater than 1.
According to one embodiment, when the distance from each point to the positioning aid photographed by the real camera is calculated, the perpendicular distance is used whenever it can be computed; when it cannot be computed (for example, when the foot of the perpendicular falls outside the photographed segment), the distance from the end point of the positioning aid to the point is used as the distance from the point to the positioning aid photographed by the real camera.
According to one embodiment, the estimated current position acquisition unit 100 sets a starting point position whose position range is known; randomly generates a plurality of virtual positions within a certain range around the starting point position; estimates, as the motor vehicle moves, the current positions of the starting point position and the virtual positions using an extended Kalman filter algorithm; and sets the estimated current positions of the starting point position and the plurality of virtual positions as the estimated current positions.
Those skilled in the art will readily appreciate that the above description of the method may be utilized to understand the apparatus of the present invention.
Those skilled in the art will readily appreciate that the method of the present invention may also include other steps corresponding to the functions performed by the apparatus of the present invention. The above steps may also be simplified.
The numbering of the elements and steps of the present invention is for convenience of description only and does not indicate the order of execution unless otherwise indicated in the context.
Those skilled in the art will appreciate that the above units can be implemented in software, in dedicated hardware such as a field-programmable gate array, a single-chip microcomputer, or another chip, or in a combination of software and hardware.
The present invention also provides an electronic device, comprising: a processor; a memory for storing the processor-executable instructions; wherein the processor is configured to execute the instructions to implement the method of the present invention.
The invention also relates to computer software which, when executed by a computing device (such as a single-chip microcomputer, a computer, or a CPU), implements the method of the invention.
The invention further relates to a storage device for computer software, such as a hard disk, a floppy disk, or a flash memory, which stores the above computer software.
The description of the method or steps of the invention may be used for understanding the description of the unit or device, and the description of the unit or device may be used for understanding the method or steps of the invention.
The above description is intended to be illustrative rather than restrictive; any changes and substitutions that come within the spirit of the invention are intended to fall within its scope of protection.
Claims (10)
1. A method of locating a motor vehicle, said motor vehicle being equipped with a camera, comprising the steps of:
determining a corresponding positioning aid in a map according to a plurality of estimated current positions of the motor vehicle, the positioning aid having a certain length;
obtaining a photograph taken at a current location, the photograph containing the positioning aid;
projecting the positioning aid onto the two-dimensional imaging plane of the camera of the vehicle at each estimated current position;
calculating the matching degree between the positioning aid projected on the two-dimensional imaging plane of the camera of the vehicle at each estimated current position and the positioning aid in the photograph taken by the camera; and
setting the estimated current position with the best matching degree as the current position.
2. The method according to claim 1, wherein a corresponding positioning aid is determined in the map for each estimated current position; or the plurality of estimated current positions are clustered and the corresponding positioning aid is determined in the map based on the center position of the cluster.
3. The motor vehicle positioning method according to claim 2, wherein a search box in the form of a convex polygon is set, feature points falling within the search box are determined, from the camera position of an estimated current position, by means of cross products, and the corresponding positioning aid is determined from the feature points falling within the search box,
wherein the cross-product test is: each vertex of the convex polygon is connected to the feature point to form a feature-point vector; then, proceeding clockwise or counterclockwise, the edge in the predetermined direction at each vertex is crossed with the feature-point vector of that vertex; the feature point is determined to be inside the search box when all the cross products have the same sign.
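The cross-product containment test of claim 3 can be sketched as follows (the function and variable names are illustrative; the vertices are assumed to be given in a consistent winding order):

```python
def point_in_convex_polygon(point, vertices):
    """True if point lies inside (or on the boundary of) a convex polygon.
    For each vertex, cross the edge leaving it with the vector from that
    vertex to the point; the point is inside iff all cross products share
    a sign, i.e. the point is on the same side of every edge."""
    px, py = point
    n = len(vertices)
    crosses = []
    for i in range(n):
        ax, ay = vertices[i]
        bx, by = vertices[(i + 1) % n]      # edge a -> b in winding order
        # z-component of (b - a) x (p - a)
        crosses.append((bx - ax) * (py - ay) - (by - ay) * (px - ax))
    return all(c >= 0 for c in crosses) or all(c <= 0 for c in crosses)
```

Feature points passing this test are the ones "falling within the search box" from which the positioning aid is determined.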
4. The motor vehicle positioning method according to claim 1, wherein the positioning aid is projected onto the two-dimensional imaging plane of the camera of the vehicle at each estimated current position as follows:
determining the coordinates and orientation of the camera in map space from the coordinates of the vehicle and the coordinates and installation angle of the camera relative to the vehicle, using a camera rotation matrix that represents the installation angle of the camera relative to the vehicle;
determining the vector from the positioning aid to the camera according to the coordinates of the positioning aid in the map;
multiplying the vector by the camera rotation matrix expressed in the map coordinate system; and
projecting the multiplied vector onto the imaging plane of the camera.
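The projection steps of claim 4 amount to a standard pinhole projection, which might be sketched as below. The camera-looks-along-+z axis convention, the direction of the vector (camera to point rather than point to camera), the row-major matrix layout, and all names are assumptions made for illustration, not details fixed by the patent.

```python
def project_to_image_plane(point_world, cam_pos, cam_rotation, focal_length=1.0):
    """Project a 3-D map point into a camera's 2-D imaging plane:
    (1) form the vector from the camera to the point in map coordinates,
    (2) rotate it into the camera frame with the camera rotation matrix,
    (3) perspective-divide by the depth (camera assumed to look along +z).
    Returns None when the point is behind the camera."""
    v = [point_world[i] - cam_pos[i] for i in range(3)]
    # rotate the map-frame vector into the camera frame (row-major 3x3)
    vc = [sum(cam_rotation[r][c] * v[c] for c in range(3)) for r in range(3)]
    if vc[2] <= 0.0:                    # behind the imaging plane
        return None
    return (focal_length * vc[0] / vc[2], focal_length * vc[1] / vc[2])
```

Projecting the two end points of a positioning aid this way yields the 2-D segment that is compared against the aid detected in the photograph.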
5. The motor vehicle positioning method according to claim 1,
the camera is one and the positioning aid is one: a predetermined number of points are taken on the positioning aid projected on the two-dimensional plane of each estimated current position, the distance from each point to the positioning aid photographed by the real camera is calculated, and the matching degree is determined according to the distances; or
the positioning aids are plural: a predetermined number of points are taken on each positioning aid, the distance from each point to the positioning aid photographed by the real camera is calculated, the shortest of these distances is taken as the distance of that positioning aid, and the matching degree is then determined according to these distances; or
the cameras are n in number: for each camera, a predetermined number of points are taken on the positioning aid, the distance from each point to the positioning aid photographed by that camera is calculated, and the shortest of these distances is taken as that camera's positioning aid distance, thereby obtaining the positioning aid distances of the n cameras; the matching degree is then determined according to the n positioning aid distances, n being an integer greater than 1.
6. The method according to claim 5, wherein, when the distance from each point to the positioning aid photographed by the real camera is calculated, the perpendicular distance is used if it can be computed; when the perpendicular distance cannot be computed, the distance from the point to an end point of the positioning aid is used as the distance from the point to the positioning aid photographed by the real camera.
7. The method of locating a motor vehicle of claim 6, further comprising the steps of:
setting a starting point position, wherein the position range of the starting point position is known;
randomly generating a plurality of virtual positions in a certain range around the starting point position;
when the motor vehicle moves, the starting point position and the virtual positions are estimated by using an extended Kalman filtering algorithm;
taking the estimated current positions of the start point position and of the plurality of virtual positions as the estimated current positions.
8. A motor vehicle locating device, the motor vehicle having a camera mounted thereon, comprising:
a positioning aid determining unit for determining a corresponding positioning aid in a map according to a plurality of estimated current positions of the motor vehicle, wherein the positioning aid has a certain length;
a photo obtaining unit that obtains a photo taken at a current position, the photo including the positioning aid;
a projection unit that projects the positioning aid onto the two-dimensional imaging plane of the camera of the vehicle at each estimated current position;
a matching degree calculation unit that calculates the matching degree between the positioning aid projected on the two-dimensional imaging plane of the camera of the vehicle at each estimated current position and the positioning aid in the photograph taken by the camera; and
a current position setting unit that sets the estimated current position with the best matching degree as the current position.
9. An electronic device, comprising:
a processor;
a memory for storing the processor-executable instructions;
wherein the processor is configured to execute the instructions to implement the method of any one of claims 1 to 7.
10. A computer-readable storage medium, on which a device control program is stored, which, when executed by a processor, implements the method of any one of claims 1 to 7.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202210107324.XA CN114119760B (en) | 2022-01-28 | 2022-01-28 | Motor vehicle positioning method and device, electronic equipment and storage medium |
Publications (2)
Publication Number | Publication Date |
---|---|
CN114119760A true CN114119760A (en) | 2022-03-01 |
CN114119760B CN114119760B (en) | 2022-06-14 |
Family
ID=80361705
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202210107324.XA Active CN114119760B (en) | 2022-01-28 | 2022-01-28 | Motor vehicle positioning method and device, electronic equipment and storage medium |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN114119760B (en) |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20120050525A1 (en) * | 2010-08-25 | 2012-03-01 | Lakeside Labs Gmbh | Apparatus and method for generating an overview image of a plurality of images using a reference plane |
CN110827353A (en) * | 2019-10-18 | 2020-02-21 | 天津大学 | Robot positioning method based on monocular camera assistance |
CN110969055A (en) * | 2018-09-29 | 2020-04-07 | 百度在线网络技术(北京)有限公司 | Method, apparatus, device and computer-readable storage medium for vehicle localization |
CN111830953A (en) * | 2019-04-12 | 2020-10-27 | 北京四维图新科技股份有限公司 | Vehicle self-positioning method, device and system |
CN113256719A (en) * | 2021-06-03 | 2021-08-13 | 舵敏智能科技(苏州)有限公司 | Parking navigation positioning method and device, electronic equipment and storage medium |
Non-Patent Citations (2)
Title |
---|
VEDRAN JELAČA et al.: "Vehicle matching in smart camera networks using image projection profiles at multiple instances", Image and Vision Computing *
YE Changhua et al.: "Vehicle road-surface position localization combining aerial and vehicle-mounted images", Science Technology and Engineering *
Also Published As
Publication number | Publication date |
---|---|
CN114119760B (en) | 2022-06-14 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||