CN115587151A - Method and apparatus for using shared SLAM map for vehicle - Google Patents


Info

Publication number
CN115587151A
Authority
CN
China
Prior art keywords
vehicle, position information, point, sub, coordinate
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202110759490.3A
Other languages
Chinese (zh)
Inventor
刘烨航 (Liu Yehang)
谭龙庆 (Tan Longqing)
李伟 (Li Wei)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Lingdong Technology Beijing Co Ltd
Original Assignee
Lingdong Technology Beijing Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Lingdong Technology Beijing Co Ltd filed Critical Lingdong Technology Beijing Co Ltd
Priority to CN202110759490.3A priority Critical patent/CN115587151A/en
Priority to PCT/CN2022/094885 priority patent/WO2023279878A1/en
Publication of CN115587151A publication Critical patent/CN115587151A/en
Pending legal-status Critical Current

Classifications

    • G: PHYSICS
    • G05: CONTROLLING; REGULATING
    • G05D: SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D 1/00: Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D 1/10: Simultaneous control of position or course in three dimensions
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 16/00: Information retrieval; Database structures therefor; File system structures therefor
    • G06F 16/20: Information retrieval of structured data, e.g. relational data
    • G06F 16/29: Geographical information databases
    • G06F 17/00: Digital computing or data processing equipment or methods, specially adapted for specific functions
    • G06F 17/10: Complex mathematical operations
    • G06F 17/16: Matrix or vector computation, e.g. matrix-matrix or matrix-vector multiplication, matrix factorization
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00: Image analysis
    • G06T 7/70: Determining position or orientation of objects or cameras
    • G06T 2207/00: Indexing scheme for image analysis or image enhancement
    • G06T 2207/30: Subject of image; Context of image processing
    • G06T 2207/30248: Vehicle exterior or interior
    • G06T 2207/30252: Vehicle exterior; Vicinity of vehicle


Abstract

The invention discloses a method and an apparatus for a vehicle to use a shared SLAM map. The method comprises the following steps: acquiring first position information of at least two points on a predetermined travel path of the vehicle relative to the vehicle, and second position information of the same points relative to a creating vehicle that created the shared SLAM map; acquiring third position information of at least one marker on the vehicle relative to the vehicle, and fourth position information of the marker relative to the creating vehicle; and determining position conversion information from the first position information, the second position information, the third position information and the fourth position information. With the method and apparatus of the present invention, position conversion information representing the detection difference between the host vehicle and the creating vehicle can be determined efficiently and accurately from the position information of the same points on the travel path as detected by each of the two vehicles, together with the marker information on the host vehicle.

Description

Method and apparatus for using shared SLAM map for vehicle
Technical Field
The present invention relates to the field of positioning, and in particular, to a method and apparatus for using a shared SLAM map for a vehicle.
Background
A smart traveling device (e.g., an AGV or AMR) can create a SLAM map by various SLAM methods while it travels, which in turn facilitates its subsequent travel.
However, because different intelligent traveling devices create their SLAM maps in different coordinate systems (e.g., a camera or lidar coordinate system), the SLAM maps created by different traveling devices in the same space are not the same. As a result, a SLAM map created by one traveling device can be used only by that device itself; other traveling devices cannot use it.
Disclosure of Invention
The invention aims to provide a method and a device for using a shared SLAM map for a vehicle.
According to an aspect of the present invention, there is provided a method for a vehicle using a shared SLAM map, the method including: acquiring first position information of at least two points on a predetermined travel path of the vehicle relative to the vehicle, and second position information of the same points relative to a creating vehicle that created the shared SLAM map; acquiring third position information of at least one marker on the vehicle relative to the vehicle, and fourth position information of the marker relative to the creating vehicle; and determining, from the first position information, the second position information, the third position information and the fourth position information, position conversion information for converting any point detected by the vehicle on the predetermined travel path into a corresponding point in the shared SLAM map.
Optionally, the first position information of the at least two points, detected by the vehicle at at least two detection positions, and the second position information of the same points, detected by the creating vehicle at those detection positions, are acquired while the vehicle and the creating vehicle travel on the predetermined travel path in synchronization.
Optionally, the fourth position information of the at least one marker on the vehicle, detected by the creating vehicle at the moments when the vehicle detects each of the at least two points, is acquired from the creating vehicle; the third position information of the at least one marker is acquired from the vehicle itself and/or obtained through a marker database, wherein the marker database comprises the respective markers on the vehicle and the position information of each marker relative to the vehicle.
Optionally, the at least two points include a first point and a second point, and the first location information includes: first sub-positional information of a first point relative to the vehicle and second sub-positional information of a second point relative to the vehicle, the second positional information comprising: third sub-position information of the first point relative to the creation vehicle and fourth sub-position information of the second point relative to the creation vehicle.
Optionally, the at least one marker includes a first marker and a second marker, wherein the first marker and the second marker respectively denote the marker on the vehicle detected by the creating vehicle when the first point and the second point are detected, and wherein the third position information includes: fifth sub-position information of the first marker relative to the vehicle and sixth sub-position information of the second marker relative to the vehicle; and the fourth position information includes: seventh sub-position information of the first marker relative to the creating vehicle and eighth sub-position information of the second marker relative to the creating vehicle.
Optionally, the vehicle comprises a single marker, wherein the first marker and the second marker both represent that single marker on the vehicle as detected by the creating vehicle at the first point and the second point, respectively.
Optionally, the vehicle includes a plurality of markers, wherein the first marker and the second marker respectively denote whichever of the two or more markers on the vehicle detected by the creating vehicle at the first point and at the second point is closest to the creating vehicle.
Optionally, each of the plurality of markers has a respective identification code, and the method further comprises: acquiring the identification code of the first marker and the identification code of the second marker as detected by the creating vehicle, wherein, according to the identification codes of the first marker and the second marker, the fifth sub-position information and the sixth sub-position information are acquired through an identification database and/or acquired from the vehicle, the identification database comprising the respective markers, the identification code corresponding to each marker, and the position information of each marker relative to the vehicle.
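As a sketch of how such an identification database might be organized (the IDs, field layout and pose values below are illustrative assumptions, not part of the disclosure), each identification code can map to the marker's pose relative to the host vehicle, measured and stored in advance:

```python
# Hypothetical marker database for the host vehicle: identification code
# -> pose (x, y, theta) of the marker relative to the host vehicle.
# All values are illustrative only.
MARKER_DB = {
    "M1": (0.8, 0.0, 0.0),     # front marker
    "M2": (-0.8, 0.0, 3.14),   # rear marker
    "M3": (0.0, 0.5, 1.57),    # left marker
    "M4": (0.0, -0.5, -1.57),  # right marker
}

def lookup_marker_pose(marker_id):
    """Return the stored pose of a marker relative to the host vehicle."""
    return MARKER_DB[marker_id]
```

The creating vehicle would report only the identification code it observed; the host-relative pose (the fifth/sixth sub-position information) then comes from this lookup.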
Optionally, the predetermined travel path is a straight travel path, and the step of determining the position conversion information according to the first, second, third and fourth position information includes: determining first position conversion information according to the first sub-position information, the third sub-position information, the fifth sub-position information and the seventh sub-position information; determining second position conversion information according to the second sub-position information, the fourth sub-position information, the sixth sub-position information and the eighth sub-position information; and determining the position conversion information according to the first sub-position information, the second sub-position information, the first position conversion information and the second position conversion information.
Optionally, the first sub-position information is a first coordinate matrix of the first point relative to a first coordinate system; the second sub-position information is a second coordinate matrix of the second point relative to the first coordinate system; the third sub-position information is a third coordinate matrix of the first point relative to a second coordinate system; the fourth sub-position information is a fourth coordinate matrix of the second point relative to the second coordinate system; the fifth sub-position information is a first identifier transformation matrix from the first marker to the origin of the first coordinate system; the sixth sub-position information is a second identifier transformation matrix from the second marker to the origin of the first coordinate system; the seventh sub-position information is a fifth coordinate matrix of the first marker relative to the second coordinate system; and the eighth sub-position information is a sixth coordinate matrix of the second marker relative to the second coordinate system, where the first coordinate system is the detection coordinate system of the vehicle and the second coordinate system is the detection coordinate system of the creating vehicle.
Alternatively, each coordinate matrix is expressed in the common form X_{F/A} = X(x_{F/A}, y_{F/A}, θ_{F/A}), giving the first coordinate matrix X_{F_i} = X(x_{F_i}, y_{F_i}, θ_{F_i}), the second coordinate matrix X_{F_j} = X(x_{F_j}, y_{F_j}, θ_{F_j}), the third coordinate matrix X_{A_i} = X(x_{A_i}, y_{A_i}, θ_{A_i}), the fourth coordinate matrix X_{A_j} = X(x_{A_j}, y_{A_j}, θ_{A_j}), the fifth coordinate matrix X_{A_{M_n}} = X(x_{A_{M_n}}, y_{A_{M_n}}, θ_{A_{M_n}}) and the sixth coordinate matrix X_{A_{M_p}} = X(x_{A_{M_p}}, y_{A_{M_p}}, θ_{A_{M_p}}), where x_{F/A} stands for the corresponding x_{F_i}, x_{F_j}, x_{A_i}, x_{A_j}, x_{A_{M_n}} or x_{A_{M_p}}; y_{F/A} stands for the corresponding y_{F_i}, y_{F_j}, y_{A_i}, y_{A_j}, y_{A_{M_n}} or y_{A_{M_p}}; and θ_{F/A} stands for the corresponding θ_{F_i}, θ_{F_j}, θ_{A_i}, θ_{A_j}, θ_{A_{M_n}} or θ_{A_{M_p}}.
Here x_{F_i} and y_{F_i} respectively denote the abscissa and ordinate of the first point i in the first x-y coordinate plane of the first coordinate system F, and θ_{F_i} denotes the angle of the first point i relative to the first x-y coordinate plane; x_{F_j}, y_{F_j} and θ_{F_j} denote the abscissa, ordinate and angle of the second point j in the first x-y coordinate plane; x_{A_i}, y_{A_i} and θ_{A_i} denote the abscissa, ordinate and angle of the first point i in the second x-y coordinate plane of the second coordinate system A; x_{A_j}, y_{A_j} and θ_{A_j} denote the abscissa, ordinate and angle of the second point j in the second x-y coordinate plane; x_{A_{M_n}}, y_{A_{M_n}} and θ_{A_{M_n}} denote the abscissa, ordinate and angle of the first marker M_n in the second x-y coordinate plane; and x_{A_{M_p}}, y_{A_{M_p}} and θ_{A_{M_p}} denote the abscissa, ordinate and angle of the second marker M_p in the second x-y coordinate plane.
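The coordinate matrices above are (x, y, θ) triples; a common way to work with them numerically (an assumption here, not something mandated by the text) is to expand each into a 3×3 homogeneous planar transform, which the later conversion steps can then multiply and invert:

```python
import numpy as np

def pose_matrix(x, y, theta):
    """Expand a coordinate matrix X(x, y, theta) into a 3x3 homogeneous
    planar transform: rotation by theta plus translation (x, y)."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, x],
                     [s,  c, y],
                     [0.0, 0.0, 1.0]])

def pose_from_matrix(m):
    """Recover the (x, y, theta) triple from a homogeneous transform."""
    return float(m[0, 2]), float(m[1, 2]), float(np.arctan2(m[1, 0], m[0, 0]))
```

Round-tripping a pose through `pose_matrix` and `pose_from_matrix` returns the original triple, which makes the representation convenient for chaining coordinate-system changes.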
Optionally, the first identifier transformation matrix is represented by

T_{M_n} = [ cos θ_{F_{M_n}}   −sin θ_{F_{M_n}}   x_{F_{M_n}} ;
            sin θ_{F_{M_n}}    cos θ_{F_{M_n}}   y_{F_{M_n}} ;
            0                  0                 1 ]

and the second identifier transformation matrix is represented by

T_{M_p} = [ cos θ_{F_{M_p}}   −sin θ_{F_{M_p}}   x_{F_{M_p}} ;
            sin θ_{F_{M_p}}    cos θ_{F_{M_p}}   y_{F_{M_p}} ;
            0                  0                 1 ]

where x_{F_{M_n}} and y_{F_{M_n}} respectively denote the abscissa and ordinate of the first marker M_n in the first x-y coordinate plane of the first coordinate system F, θ_{F_{M_n}} denotes the angle of the first marker M_n relative to the first x-y coordinate plane, x_{F_{M_p}} and y_{F_{M_p}} respectively denote the abscissa and ordinate of the second marker M_p in the first x-y coordinate plane, and θ_{F_{M_p}} denotes the angle of the second marker M_p relative to the first x-y coordinate plane.
Alternatively, writing H(X) for the 3×3 homogeneous matrix built from a coordinate matrix X = X(x, y, θ) in the same way as the identifier transformation matrices, the first position conversion information T_1 is determined by the following equation

T_1 = H(X_{A_{M_n}}) · T_{M_n}^{-1}

and the second position conversion information T_2 is determined by the following equation

T_2 = H(X_{A_{M_p}}) · T_{M_p}^{-1}
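As a hedged sketch of how such per-detection conversion information could be computed (the function names are illustrative, and the homogeneous-matrix convention is an assumption rather than the patent's exact formulation), the conversion composes the marker pose observed by the creating vehicle with the inverse of the marker's stored pose on the host vehicle:

```python
import numpy as np

def pose_matrix(x, y, theta):
    """3x3 homogeneous planar transform for a pose (x, y, theta)."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, x], [s, c, y], [0.0, 0.0, 1.0]])

def conversion_at_detection(marker_in_A, marker_in_F):
    """Transform taking host-vehicle (F) coordinates into creating-vehicle
    (A) coordinates at one detection moment: the marker pose observed by
    the creating vehicle, composed with the inverse of the marker's
    stored pose relative to the host vehicle."""
    return pose_matrix(*marker_in_A) @ np.linalg.inv(pose_matrix(*marker_in_F))
```

For example, a marker stored at (1, 0, 0) on the host and observed at (3, 3, 0) by the creating vehicle yields a pure translation of (2, 3) between the two frames.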
Alternatively, the position conversion information is determined by interpolating, component by component in (x, y, θ), between the poses represented by the first position conversion information T_1 and the second position conversion information T_2, according to where the point lies along the straight travel path:

s_k = √((x_{F_k} − x_{F_i})² + (y_{F_k} − y_{F_i})²) / √((x_{F_j} − x_{F_i})² + (y_{F_j} − y_{F_i})²)

x_{T_k} = x_{T_1} + s_k · (x_{T_2} − x_{T_1})
y_{T_k} = y_{T_1} + s_k · (y_{T_2} − y_{T_1})
θ_{T_k} = θ_{T_1} + s_k · (θ_{T_2} − θ_{T_1})

where (x_{T_1}, y_{T_1}, θ_{T_1}) and (x_{T_2}, y_{T_2}, θ_{T_2}) are the pose parameters of T_1 and T_2, (x_{T_k}, y_{T_k}, θ_{T_k}) are the pose parameters of the position conversion information T_k applied to the arbitrary point k, and x_{F_k} and y_{F_k} respectively denote the abscissa and ordinate, in the first x-y coordinate plane of the first coordinate system F, of the arbitrary point k detected by the vehicle on the predetermined travel path.
Alternatively, when the arbitrary point k is detected on the predetermined travel path of the vehicle, an arbitrary-point coordinate matrix X_{F_k} of the point k is obtained as

X_{F_k} = X(x_{F_k}, y_{F_k}, θ_{F_k})

and the corresponding-point matrix X_{A_k} of the point corresponding to k in the shared SLAM map is determined by the following equation, where H(·) denotes the 3×3 homogeneous matrix of a pose and T_k denotes the position conversion information for point k:

H(X_{A_k}) = T_k · H(X_{F_k})

where θ_{F_k} denotes the angle of the arbitrary point k relative to the first x-y coordinate plane of the first coordinate system F; x_{A_k} and y_{A_k} respectively denote the abscissa and ordinate, in the second x-y coordinate plane of the second coordinate system A, of the point in the shared SLAM map corresponding to k; and θ_{A_k} denotes the angle of that corresponding point relative to the second x-y coordinate plane.
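A minimal sketch of applying a determined conversion to an arbitrary detected point, under the same assumed homogeneous-matrix convention (names are illustrative, not the patent's):

```python
import numpy as np

def pose_matrix(x, y, theta):
    """3x3 homogeneous planar transform for a pose (x, y, theta)."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, x], [s, c, y], [0.0, 0.0, 1.0]])

def convert_point(T, point_F):
    """Map a pose (x, y, theta) detected in the host frame F into the
    shared-map frame A using a 3x3 conversion transform T."""
    m = T @ pose_matrix(*point_F)
    return float(m[0, 2]), float(m[1, 2]), float(np.arctan2(m[1, 0], m[0, 0]))
```

With a pure-translation conversion of (2, 3, 0), a point detected at (1, 1, 0) in the host frame lands at (3, 4, 0) in the shared map.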
According to another aspect of the present invention, there is provided an apparatus for a vehicle using a shared SLAM map, the apparatus including: a first acquisition unit configured to acquire first position information of at least two points on a predetermined travel path of the vehicle relative to the vehicle, and second position information of the same points relative to a creating vehicle that created the shared SLAM map; a second acquisition unit configured to acquire third position information of at least one marker on the vehicle relative to the vehicle, and fourth position information of the marker relative to the creating vehicle; and a determination unit configured to determine, from the first position information, the second position information, the third position information and the fourth position information, position conversion information for converting an arbitrary point detected by the vehicle on the predetermined travel path into a corresponding point in the shared SLAM map.
According to another aspect of the present invention, a computer program product is provided, wherein the computer program product comprises a computer program which, when executed by a processor, causes the processor to implement the method for using a shared SLAM map for a vehicle according to the present invention.
According to the method and apparatus for a vehicle using a shared SLAM map of the present invention, position conversion information representing the detection difference between the host vehicle and the creating vehicle can be determined efficiently and accurately from the position information of the same points on the travel path as detected by each of the two vehicles, together with the marker information on the host vehicle, so that the host vehicle can accurately use a shared SLAM map created by another vehicle.
Drawings
The foregoing and other aspects of the invention will be more fully understood from the following detailed description taken in conjunction with the accompanying drawings, in which:
fig. 1 shows a flowchart of a method for a vehicle using a shared SLAM map according to an exemplary embodiment of the present invention.
FIG. 2 shows a schematic diagram of a plurality of markers on a vehicle according to an exemplary embodiment of the present invention.
Fig. 3 is a flowchart illustrating a step of determining location conversion information in a method for a vehicle using a shared SLAM map according to an exemplary embodiment of the present invention.
Fig. 4 shows a block diagram of an apparatus for a vehicle using a shared SLAM map according to an exemplary embodiment of the present invention.
Detailed Description
In the following, some exemplary embodiments of the invention will be described in more detail with reference to the accompanying drawings in order to better understand the basic ideas and advantages of the invention.
Fig. 1 shows a flowchart of a method for a vehicle using a shared SLAM map according to an exemplary embodiment of the present invention.
In step S1, first position information of at least two points on a predetermined travel path of the vehicle with respect to the vehicle and second position information with respect to a creating vehicle for creating a shared SLAM map are acquired.
In one embodiment, the predetermined travel path may be a straight travel path.
For example, the creating vehicle used to create the shared SLAM map may be an AGV, an AMR, or the like operating in a warehouse. The vehicle that needs to use the shared SLAM map (for ease of description also referred to herein as the "host vehicle" or "the vehicle") may be a vehicle that travels back and forth on one or more straight paths, such as a forklift truck; forklifts, for instance, typically travel along straight routes within a warehouse to handle goods.
To carry out the method illustrated in fig. 1, the host vehicle and the creating vehicle may, as an example, travel together, i.e., side by side at the same location, along a straight path that the host vehicle needs to travel back and forth, so that the two vehicles can detect the same points on the path and can detect each other.
In one embodiment, in step S1, the first position information of the at least two points, detected by the vehicle at at least two detection positions, and the second position information of the same points, detected by the creating vehicle at those detection positions, may be acquired while the vehicle and the creating vehicle travel on the predetermined travel path in synchronization.
In other words, the first position information acquired in step S1 may be position information of a point on the straight travel path as detected by the host vehicle (e.g., a forklift), and the second position information may be position information of the same point as detected by the creating vehicle (e.g., an AGV or AMR) traveling together with it.
In one embodiment, the host vehicle and the creating vehicle may detect only two points on the predetermined travel path, that is, the at least two points may include only the first point and the second point.
At this time, as an example, the first location information may include: first sub-positional information of a first point relative to the vehicle and second sub-positional information of a second point relative to the vehicle. The second location information may include: third sub-position information of the first point relative to the creation vehicle and fourth sub-position information of the second point relative to the creation vehicle.
Simultaneously with, before, or after step S1, step S2 may be performed to obtain third position information of at least one marker on the vehicle relative to the vehicle and fourth position information relative to the creating vehicle.
In the case where, as in the above example, the vehicle and the creating vehicle travel on the predetermined travel path in synchronization, in one embodiment the fourth position information of the at least one marker on the vehicle, as detected by the creating vehicle at the moments when the vehicle detects each of the at least two points, may be acquired from the creating vehicle in step S2. The third position information of the at least one marker may then be obtained through a marker database and/or from the vehicle itself. The marker database may include the individual markers on the vehicle and the position information of each marker relative to the vehicle.
In one embodiment, the vehicle, i.e., the host vehicle, may include a single marker or multiple markers. For example, each marker on the host vehicle may have a corresponding ID, and the position information of each marker relative to the host vehicle (e.g., the above third position information) may be measured in advance and stored in correspondence with the marker's ID.
For example, the plurality of markers on the host vehicle may be components at different parts of the host vehicle, or may be markers provided, for example pasted, at different positions on the host vehicle.
FIG. 2 shows a schematic diagram of a plurality of markers on a vehicle according to an exemplary embodiment of the present invention.
Fig. 2 shows the vehicle as a forklift. As can be seen from the top view of the forklift shown in fig. 2, four markers M_1, M_2, M_3 and M_4 are arranged, for example adhered, on the four sides of the forklift, i.e., at the front, rear, left and right.
It should be understood that the number and positions of the markers shown in fig. 2 are only examples; more or fewer markers may be arranged at different positions according to actual needs, and a marker arranged on the vehicle may take any form, such as a vehicle part, a reflective strip, a two-dimensional code, and the like.
In the case where only the position information of two points on the path, namely the first point and the second point, is detected in step S1 of fig. 1, in one embodiment the at least one marker in step S2 may include a first marker and a second marker, where the first marker and the second marker respectively denote the marker on the vehicle detected by the creating vehicle when the first point and the second point are detected. The third position information may include: fifth sub-position information of the first marker relative to the vehicle and sixth sub-position information of the second marker relative to the vehicle; the fourth position information may include: seventh sub-position information of the first marker relative to the creating vehicle and eighth sub-position information of the second marker relative to the creating vehicle.
In one embodiment, in the case where the vehicle (the host vehicle) includes a single marker, the first marker and the second marker may both denote that marker on the vehicle as detected by the creating vehicle at the first point and the second point, respectively.
In other words, the first marker and the second marker may then be different observations of the same marker on the vehicle at different detection positions.
In another embodiment, where the vehicle includes a plurality of markers, the first marker and the second marker may respectively denote whichever of the two or more markers on the vehicle detected by the creating vehicle at the first point and at the second point is closest to the creating vehicle.
At this time, each of the plurality of markers may have a corresponding identification code. In this case, in one embodiment, the method for using a shared SLAM map for a vehicle according to the present invention may further include: acquiring the identification code of the first marker and the identification code of the second marker as detected by the creating vehicle.
In this case, in step S2, the fifth sub-position information and the sixth sub-position information may be acquired through the identification database and/or from the vehicle, according to the identification code of the first marker and the identification code of the second marker. The identification database here includes the respective markers, the identification code corresponding to each marker, and the position information of each marker relative to the vehicle.
For example, while the host vehicle and the creating vehicle travel together on the path, when the host vehicle detects a point on the path (e.g., the first point or the second point, to acquire the first position information), the creating vehicle may determine its own current position (that is, detect the same first or second point, to acquire the second position information) and detect a marker on the host vehicle. When two or more markers are detected, the creating vehicle determines the position of the nearest marker (i.e., the corresponding first or second marker) relative to itself (the fourth position information), while the position of that marker relative to the host vehicle (the third position information) is retrieved from the pre-stored data, for example via the marker's identification code (ID).
At this time, in order to prevent the creating vehicle from detecting markers of vehicles other than the host vehicle and wrongly attributing them to the host vehicle, detected markers may be accepted as markers of the host vehicle only when the creating vehicle detects two or more of the host vehicle's markers at a time. Moreover, the creating vehicle may determine the position information relative to itself, i.e., the fourth position information, for only one of the detected markers (e.g., the nearest one).
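The filtering rule described above (accept detections only when at least two of the host vehicle's markers are seen at once, then keep the nearest one) might be sketched as follows; the data layout is an illustrative assumption:

```python
import math

def select_marker(detections):
    """detections: list of (marker_id, x, y) poses observed by the
    creating vehicle. Returns the nearest detection, or None when fewer
    than two markers are seen at once, guarding against stray markers
    that belong to other vehicles."""
    if len(detections) < 2:
        return None
    return min(detections, key=lambda d: math.hypot(d[1], d[2]))
```

A single detection is rejected outright; with two or more, only the nearest marker's position is carried forward as the fourth position information.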
After obtaining the above information in steps S1 and S2, step S3 may be performed to determine position conversion information for converting any point detected by the vehicle on the predetermined travel path into a corresponding point in the shared SLAM map, based on the first position information, the second position information, the third position information, and the fourth position information.
Fig. 3 shows a flowchart of step S3 of determining location conversion information in the method of using a shared SLAM map for a vehicle according to an exemplary embodiment of the present invention.
Referring to fig. 3, in step S31, the first position conversion information may be determined based on the first sub-position information (of the first point with respect to the host vehicle), the third sub-position information (of the first point with respect to the creating vehicle), the fifth sub-position information (of the first marker with respect to the host vehicle), and the seventh sub-position information (of the first marker with respect to the creating vehicle).
In other words, the first position conversion information is conversion information obtained from the position information of the first point relative to the host vehicle and to the creating vehicle, and the position information of the first marker relative to the host vehicle and to the creating vehicle.
In step S32, the second position transition information may be determined based on the second sub-position information (of the second point with respect to the host vehicle), the fourth sub-position information (of the second point with respect to the creating vehicle), the sixth sub-position information (of the second mark with respect to the host vehicle), and the eighth sub-position information (of the second mark with respect to the creating vehicle).
In other words, the second position conversion information is conversion information obtained from the position information of the second point relative to the host vehicle and to the creating vehicle, together with the position information of the second identifier relative to the host vehicle and to the creating vehicle.
It should be understood that the identifiers used in step S31 and step S32, i.e., the first identifier and the second identifier detected by the creating vehicle, may be the same or different.
In step S33, the position conversion information is determined according to the first sub-position information, the second sub-position information, the first position conversion information, and the second position conversion information.
Here, by using the position information of both two points on the travel path of the host vehicle and of the identifier on the host vehicle, each expressed relative to both the host vehicle and the creating vehicle (the first to fourth position information), the detection difference between the host vehicle and the creating vehicle, i.e., the above position conversion relationship, can be accurately determined. Through this relationship, the host vehicle can quickly and accurately represent any point it detects on the travel path as a corresponding point in the shared SLAM map; that is, the host vehicle can efficiently and accurately use the shared SLAM map created by the other vehicle.
In the following, the method shown in fig. 1 is further described by means of corresponding formulaic expressions.
Each piece of sub-position information included in the first position information and the second position information obtained in step S1 may be coordinate information of the corresponding point as detected by the host vehicle or by the creating vehicle.
In one embodiment, the first sub-position information in the first position information may be a first coordinate matrix of the first point with respect to a first coordinate system. The second sub-position information in the first position information may be a second coordinate matrix of the second point with respect to the first coordinate system. The third sub-position information in the second position information may be a third coordinate matrix of the first point with respect to the second coordinate system. The fourth sub-position information in the second position information may be a fourth coordinate matrix of the second point with respect to the second coordinate system.
The fifth sub-position information in the third position information obtained in step S2 may be a first identification conversion matrix from the first identifier to the origin of the first coordinate system. The sixth sub-position information in the third position information may be a second identification conversion matrix from the second identifier to the origin of the first coordinate system. The seventh sub-position information in the fourth position information may be a fifth coordinate matrix of the first identifier relative to the second coordinate system. The eighth sub-position information in the fourth position information may be a sixth coordinate matrix of the second identifier relative to the second coordinate system.
At this time, the first coordinate system may be a detection coordinate system of the vehicle (own vehicle), and the second coordinate system may be a detection coordinate system of the creating vehicle.
For example, each of the above coordinate matrices may represent the positional relationship of the corresponding point or identifier with respect to the origin of the corresponding coordinate system.
In one embodiment, each of the above first to sixth coordinate matrices may have the form of a coordinate matrix X_{F/A} = X(x_{F/A}, y_{F/A}, θ_{F/A}) represented by the following equation (1):

X(x_{F/A}, y_{F/A}, θ_{F/A}) =
[ cos θ_{F/A}   −sin θ_{F/A}   x_{F/A} ]
[ sin θ_{F/A}    cos θ_{F/A}   y_{F/A} ]
[ 0              0             1       ]        (1)
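Equation (1) survives only as an image in this extraction; a common reading, assumed here, is the standard planar homogeneous transform built from a pose (x, y, θ). A minimal Python sketch under that assumption:

```python
import math

def coord_matrix(x, y, theta):
    """Homogeneous 2-D coordinate matrix X(x, y, θ).

    Assumed interpretation of equation (1): the standard planar
    homogeneous transform built from a pose (x, y, θ); the patent
    shows the matrix itself only as an image.
    """
    c, s = math.cos(theta), math.sin(theta)
    return [[c,  -s,  x],
            [s,   c,  y],
            [0.0, 0.0, 1.0]]

# e.g. the first coordinate matrix X_{F_i} for a point at (1, 2) with angle π/2
X_Fi = coord_matrix(1.0, 2.0, math.pi / 2)
```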
As an example, the first coordinate matrix may be represented as X_{F_i} = X(x_{F_i}, y_{F_i}, θ_{F_i}), the second coordinate matrix as X_{F_j} = X(x_{F_j}, y_{F_j}, θ_{F_j}), the third coordinate matrix as X_{A_i} = X(x_{A_i}, y_{A_i}, θ_{A_i}), and the fourth coordinate matrix as X_{A_j} = X(x_{A_j}, y_{A_j}, θ_{A_j}). The fifth coordinate matrix can be expressed as X^A_{M_n} = X(x^A_{M_n}, y^A_{M_n}, θ^A_{M_n}), and the sixth coordinate matrix as X^A_{M_p} = X(x^A_{M_p}, y^A_{M_p}, θ^A_{M_p}).
In other words, x_{F/A} in equation (1) as generalized above can be used to represent x_{F_i}, x_{F_j}, x_{A_i}, x_{A_j}, x^A_{M_n}, or x^A_{M_p}; y_{F/A} in equation (1) can be used to represent the corresponding y_{F_i}, y_{F_j}, y_{A_i}, y_{A_j}, y^A_{M_n}, or y^A_{M_p}; and θ_{F/A} in equation (1) can be used to represent the corresponding θ_{F_i}, θ_{F_j}, θ_{A_i}, θ_{A_j}, θ^A_{M_n}, or θ^A_{M_p}.
At this time, x_{F_i} and y_{F_i} in the first coordinate matrix may respectively represent the abscissa and ordinate of the first point i in a first x-y coordinate plane of a first coordinate system F, and θ_{F_i} may represent the angle of the first point i with respect to the first x-y coordinate plane. That is, the coordinates of the first point i in the first coordinate system F may be (x_{F_i}, y_{F_i}, θ_{F_i}).

Further, x_{F_j} and y_{F_j} in the second coordinate matrix may respectively represent the abscissa and ordinate of the second point j in the first x-y coordinate plane, and θ_{F_j} may represent the angle of the second point j with respect to the first x-y coordinate plane. That is, the coordinates of the second point j in the first coordinate system F may be (x_{F_j}, y_{F_j}, θ_{F_j}).

Furthermore, x_{A_i} and y_{A_i} in the third coordinate matrix may respectively represent the abscissa and ordinate of the first point i in a second x-y coordinate plane of the second coordinate system A, and θ_{A_i} may represent the angle of the first point i with respect to the second x-y coordinate plane. That is, the coordinates of the first point i in the second coordinate system A may be (x_{A_i}, y_{A_i}, θ_{A_i}).

Furthermore, x_{A_j} and y_{A_j} in the fourth coordinate matrix may respectively represent the abscissa and ordinate of the second point j in the second x-y coordinate plane, and θ_{A_j} may represent the angle of the second point j with respect to the second x-y coordinate plane. That is, the coordinates of the second point j in the second coordinate system A may be (x_{A_j}, y_{A_j}, θ_{A_j}).
In addition, x^A_{M_n} and y^A_{M_n} in the fifth coordinate matrix may respectively represent the abscissa and ordinate of the first identifier M_n in the second x-y coordinate plane of the second coordinate system A, and θ^A_{M_n} may represent the angle of the first identifier M_n with respect to the second x-y coordinate plane. That is, the coordinates of the first identifier M_n in the second coordinate system A may be (x^A_{M_n}, y^A_{M_n}, θ^A_{M_n}).

Further, x^A_{M_p} and y^A_{M_p} in the sixth coordinate matrix may respectively represent the abscissa and ordinate of the second identifier M_p in the second x-y coordinate plane, and θ^A_{M_p} may represent the angle of the second identifier M_p with respect to the second x-y coordinate plane. That is, the coordinates of the second identifier M_p in the second coordinate system A may be (x^A_{M_p}, y^A_{M_p}, θ^A_{M_p}).
It should be understood that the abscissas, ordinates, and angles defined above are only examples for locating the position of the respective point or identifier in the respective coordinate system (e.g., in the camera coordinate system of a forklift, or in the camera or lidar coordinate system of an AGV or AMR), and that any other way of defining the respective point or identifier in the respective coordinate system may be used according to practical requirements.
In the case where the first to sixth coordinate matrices have the matrix form shown in equation (1) above, in one embodiment, the first identification conversion matrix in the third position information obtained in step S2 may be constructed, in the form of equation (1), from the coordinates (x^F_{M_n}, y^F_{M_n}, θ^F_{M_n}) of the first identifier M_n in the first coordinate system F (equation (2)), and the second identification conversion matrix in the third position information may be constructed from the coordinates (x^F_{M_p}, y^F_{M_p}, θ^F_{M_p}) of the second identifier M_p in the first coordinate system F (equation (3)).
At this time, x^F_{M_n} and y^F_{M_n} may respectively represent the abscissa and ordinate of the first identifier M_n in a first x-y coordinate plane of the first coordinate system F, and θ^F_{M_n} represents the angle of the first identifier M_n with respect to the first x-y coordinate plane. That is, the coordinates of the first identifier M_n in the first coordinate system F may be (x^F_{M_n}, y^F_{M_n}, θ^F_{M_n}).

Similarly, x^F_{M_p} and y^F_{M_p} may respectively represent the abscissa and ordinate of the second identifier M_p in the first x-y coordinate plane, and θ^F_{M_p} represents the angle of the second identifier M_p with respect to the first x-y coordinate plane. That is, the coordinates of the second identifier M_p in the first coordinate system F may be (x^F_{M_p}, y^F_{M_p}, θ^F_{M_p}).
Thereafter, in step S31, the first position conversion information may be determined by the following equation (4):

[equation (4): not legible in the text extraction]

In step S32, the second position conversion information may be determined by the following equation (5):

[equation (5): not legible in the text extraction]
It should be understood that, when the creating vehicle detects the same nearest identifier on the host vehicle corresponding to both the first point and the second point, i.e., when the first identifier and the second identifier are the same, the first and second identification conversion matrices appearing in equations (4) and (5) above may be the same.
Thereafter, in step S33, the position conversion information may be determined by the following equation (6) and its auxiliary definitions:

[equation (6) and auxiliary definitions: not legible in the text extraction]

wherein x_{F_k} and y_{F_k} respectively represent the abscissa and ordinate, in the first x-y coordinate plane of the first coordinate system F (of the vehicle), of an arbitrary point k detected by the vehicle on the predetermined travel path.
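Equation (6) is not legible in this extraction, but the role it plays — combining the point observations from the two detection frames into a single planar conversion between them — can be illustrated with a generic two-point rigid-registration sketch. This is a stand-in under stated assumptions, not the patent's exact formula:

```python
import math

def transform_from_two_points(p_i, p_j, q_i, q_j):
    """Estimate a planar rigid transform (theta, tx, ty) mapping the
    host-vehicle frame F to the creating-vehicle frame A, from two
    corresponding points p (in F) and q (in A).

    Generic two-point registration standing in for equations (4)-(6),
    whose exact forms are not reproduced in the source text.
    """
    # Rotation: difference of the directions of the segment i->j in each frame.
    theta = (math.atan2(q_j[1] - q_i[1], q_j[0] - q_i[0])
             - math.atan2(p_j[1] - p_i[1], p_j[0] - p_i[0]))
    c, s = math.cos(theta), math.sin(theta)
    # Translation: whatever offset carries the rotated p_i onto q_i.
    tx = q_i[0] - (c * p_i[0] - s * p_i[1])
    ty = q_i[1] - (s * p_i[0] + c * p_i[1])
    return theta, tx, ty

def apply_transform(T, p):
    """Apply the planar rigid transform T = (theta, tx, ty) to a point p."""
    theta, tx, ty = T
    c, s = math.cos(theta), math.sin(theta)
    return (c * p[0] - s * p[1] + tx, s * p[0] + c * p[1] + ty)
```

With two exact correspondences the transform is fully determined; with noisy detections, more point pairs and a least-squares fit would be used instead.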
After the above position conversion information has been determined, the position in the shared SLAM map of an arbitrary point k detected by the host vehicle while traveling on the predetermined travel path (a straight travel path) can be determined.

In one embodiment, the arbitrary point k is detected on the predetermined travel path of the vehicle, and an arbitrary-point coordinate matrix X_{F_k} of the arbitrary point k, expressed by the following equation (7), is obtained:

X_{F_k} = X(x_{F_k}, y_{F_k}, θ_{F_k})        (7)

The corresponding-point matrix X_{A_k} of the corresponding point of the arbitrary point k in the shared SLAM map can then be determined by the following equation (8):

[equation (8): not legible in the text extraction]

In the above equations, θ_{F_k} may represent the angle of the arbitrary point k with respect to the first x-y coordinate plane of the first coordinate system F; x_{A_k} and y_{A_k} respectively represent the abscissa and ordinate of the corresponding point in the shared SLAM map in the second x-y coordinate plane of the second coordinate system A; and θ_{A_k} may represent the angle of the corresponding point of the arbitrary point k in the shared SLAM map with respect to the second x-y coordinate plane. In other words, the coordinates in the shared SLAM map of the corresponding point of the arbitrary point k detected by the vehicle can be determined by equation (8) above to be (x_{A_k}, y_{A_k}, θ_{A_k}).
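Equation (8) likewise survives only as an image. Under the assumption that the position conversion information acts as a planar rigid transform (a rotation angle plus a translation), mapping a detected pose (x_{F_k}, y_{F_k}, θ_{F_k}) into the shared-map frame can be sketched as:

```python
import math

def map_point_to_shared_map(T, pose_F):
    """Map a pose (x, y, theta) detected in the host frame F into the
    shared SLAM map frame A, given conversion information T = (theta, tx, ty).

    Assumed reading of equation (8), which is not legible in the source:
    the conversion rotates and translates the position, and offsets the
    heading by the conversion angle.
    """
    conv_theta, tx, ty = T
    x, y, heading = pose_F
    c, s = math.cos(conv_theta), math.sin(conv_theta)
    return (c * x - s * y + tx,      # x_{A_k}
            s * x + c * y + ty,      # y_{A_k}
            heading + conv_theta)    # theta_{A_k}

# Example: a 90-degree conversion with translation (1, 0).
pose_A = map_point_to_shared_map((math.pi / 2, 1.0, 0.0), (2.0, 0.0, 0.1))
```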
According to the method for a vehicle using a shared SLAM map of the present invention, position conversion information that represents the detection difference between the host vehicle and the creating vehicle can be determined efficiently and accurately from the position information of the same points on the travel path detected by the host vehicle and by the creating vehicle, together with the identification information on the host vehicle. The host vehicle can therefore accurately use the shared SLAM map created by another creating vehicle.
Fig. 4 shows a block diagram of an apparatus for a vehicle using a shared SLAM map according to an exemplary embodiment of the present invention.
Referring to fig. 4, the apparatus for using a shared SLAM map for a vehicle according to the present invention includes: a first acquisition unit 1, a second acquisition unit 2 and a determination unit 3.
The first acquisition unit 1 is configured to be able to acquire first position information of at least two points on a predetermined travel path of the vehicle with respect to the vehicle and second position information with respect to a creating vehicle for creating a shared SLAM map.
The second obtaining unit 2 is configured to be able to obtain third position information of at least one logo on the vehicle relative to the vehicle and fourth position information relative to the creating vehicle.
The determination unit 3 is configured to be able to determine position conversion information for converting an arbitrary point detected by the vehicle on the predetermined travel path into a corresponding point in the shared SLAM map, from the first position information, the second position information, the third position information, and the fourth position information.
The determination of the first position information, the second position information, the third position information, the fourth position information, and the position conversion information has been described in detail above with reference to fig. 1 and 2, and is not described again here.
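As a structural sketch only — the unit interfaces below are hypothetical, since the patent defines just the three units' roles — the apparatus of Fig. 4 could be wired as:

```python
class SharedSlamMapApparatus:
    """Structural sketch of the apparatus of Fig. 4: a first acquisition
    unit, a second acquisition unit, and a determination unit.  The unit
    call signatures are assumptions for illustration.
    """
    def __init__(self, first_acquisition_unit, second_acquisition_unit, determination_unit):
        self.first_acquisition_unit = first_acquisition_unit    # points: 1st/2nd position info
        self.second_acquisition_unit = second_acquisition_unit  # identifiers: 3rd/4th position info
        self.determination_unit = determination_unit            # -> position conversion info

    def position_conversion(self):
        p1, p2 = self.first_acquisition_unit()
        p3, p4 = self.second_acquisition_unit()
        return self.determination_unit(p1, p2, p3, p4)

# Example wiring with stub units standing in for real sensing/estimation.
apparatus = SharedSlamMapApparatus(
    lambda: ("first", "second"),
    lambda: ("third", "fourth"),
    lambda p1, p2, p3, p4: (p1, p2, p3, p4),
)
result = apparatus.position_conversion()
```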
According to the apparatus for a vehicle using a shared SLAM map of the present invention, position conversion information that represents the detection difference between the host vehicle and the creating vehicle can be determined efficiently and accurately from the position information of the same points on the travel path detected by the host vehicle and by the creating vehicle, together with the identification information on the host vehicle. The host vehicle can therefore accurately use the shared SLAM map created by another creating vehicle.
There is also provided in accordance with an exemplary embodiment of the invention a computer program product, wherein the computer program product comprises a computer program which, when executed by a processor, causes the processor to implement the method for using a shared SLAM map for a vehicle according to the invention. The computer program product may include a computer program, program code, instructions, or some combination thereof for instructing or configuring hardware devices, individually or collectively, to operate as desired. The computer program and/or program code can include a program or computer-readable instructions, software components, software modules, data files, data structures, etc., that can be implemented by one or more hardware devices. Examples of program code may include machine code, such as produced by a compiler, and higher level program code, such as executed using an interpreter.
There is also provided in accordance with an exemplary embodiment of the present invention a computer readable recording medium storing a computer program configured to, when executed by a processor, implement a method for using a shared SLAM map for a vehicle in accordance with the present invention. The computer readable recording medium is any data storage device that can store data read by a computer system. Examples of the computer-readable recording medium include: read-only memory, random access memory, read-only optical disks, magnetic tapes, floppy disks, optical data storage devices, and carrier waves (such as data transmission through the internet via wired or wireless transmission paths). The computer readable recording medium can also be distributed over network coupled computer systems so that the computer readable code is stored and executed in a distributed fashion. Also, functional programs, codes, and code segments for accomplishing the present invention can be easily construed by programmers of ordinary skill in the art to which the present invention pertains within the scope of the present invention.
Furthermore, each unit in the above-described apparatuses and devices according to exemplary embodiments of the present invention may be implemented as a hardware component or a software module. Further, the respective units may be implemented by using, for example, a Field Programmable Gate Array (FPGA), an Application Specific Integrated Circuit (ASIC), or a processor according to the processing performed by the respective units defined by those skilled in the art.
Although the invention is illustrated and described herein with reference to specific embodiments, the invention is not intended to be limited to the details shown. Rather, various modifications may be made in the details within the scope of the invention.
List of reference numerals
S1. Acquiring first position information of at least two points on a predetermined travel path of the vehicle relative to the vehicle and second position information relative to a creating vehicle for creating a shared SLAM map
S2. Acquiring third position information of at least one identifier on the vehicle relative to the vehicle and fourth position information relative to the creating vehicle
S3, determining position conversion information according to the first position information, the second position information, the third position information and the fourth position information
S31, determining first position conversion information according to the first sub-position information, the third sub-position information, the fifth sub-position information and the seventh sub-position information
S32 determining second position conversion information according to the second sub-position information, the fourth sub-position information, the sixth sub-position information and the eighth sub-position information
S33, determining the position conversion information according to the first sub-position information, the second sub-position information, the first position conversion information and the second position conversion information
M1, M2, M3, M4. Identifiers
1. First acquisition unit
2. Second acquisition unit
3. Determining unit

Claims (17)

1. A method for use of a shared SLAM map for a vehicle, the method comprising:
acquiring first position information of at least two points on a predetermined travel path of the vehicle relative to the vehicle and second position information relative to a creating vehicle for creating a shared SLAM map;
obtaining third position information of at least one mark on the vehicle relative to the vehicle and fourth position information relative to a creating vehicle;
and determining position conversion information according to the first position information, the second position information, the third position information and the fourth position information, wherein the position conversion information is used for converting any point detected by the vehicle on the preset running path into a corresponding point in the shared SLAM map.
2. The method according to claim 1, wherein, with the vehicle and the creating vehicle traveling on the predetermined travel path in synchronization, first position information of the at least two points detected by the vehicle at at least two detection positions, respectively, and second position information of the at least two points detected by the creating vehicle at the at least two detection positions, respectively, are acquired.
3. The method according to claim 2, wherein the fourth position information, i.e., the position information of at least one identifier on the vehicle detected by the creating vehicle when the vehicle detects the at least two points, respectively, is acquired via the creating vehicle; and

the third position information of the at least one identifier is obtained via an identification database, and/or the third position information of the at least one identifier as detected by the vehicle is obtained via the vehicle,

wherein the identification database comprises the respective identifiers on the vehicle and position information of each identifier relative to the vehicle.
4. The method of claim 3, wherein the at least two points comprise a first point and a second point,
the first location information includes: first sub-positional information of a first point relative to the vehicle and second sub-positional information of a second point relative to the vehicle,
the second location information includes: third sub-position information of the first point relative to the creating vehicle and fourth sub-position information of the second point relative to the creating vehicle.
5. The method of claim 4, wherein the at least one identifier comprises a first identifier and a second identifier, wherein the first identifier and the second identifier represent one identifier on the vehicle detected by the creating vehicle corresponding to the first point and the second point, respectively,
wherein the third location information includes: fifth sub-position information of the first marker relative to the vehicle and sixth sub-position information of the second marker relative to the vehicle;
the fourth location information includes: seventh sub-position information of the first marker relative to the creating vehicle and eighth sub-position information of the second marker relative to the creating vehicle.
6. The method of claim 5, wherein the vehicle includes a single identifier,

wherein the first identifier and the second identifier each represent that single identifier on the vehicle as detected by the creating vehicle corresponding to the first point and the second point, respectively.
7. The method of claim 5, wherein the vehicle includes a plurality of identifiers,
wherein the first identifier and the second identifier respectively represent one identifier which is closest to the creating vehicle in two or more identifiers detected by the creating vehicle corresponding to the first point and the second point on the vehicle.
8. The method of claim 7, wherein each of the plurality of identities has a respective identification code,
the method further comprises the following steps: acquiring an identification code of a first identification and an identification code of a second identification that create a vehicle detection,
wherein, according to the identification code of the first identification and the identification code of the second identification, the fifth sub-position information and the sixth sub-position information are acquired through an identification database, and/or the fifth sub-position information and the sixth sub-position information detected by the vehicle are acquired through the vehicle,
wherein the identification database includes respective identifications, identification codes corresponding to each identification, and position information of each identification with respect to the vehicle.
9. The method according to any one of claims 5 to 8, wherein the predetermined travel path is a straight travel path,
wherein, the step of determining the position conversion information according to the first position information, the second position information, the third position information and the fourth position information comprises:
determining first position conversion information according to the first sub-position information, the third sub-position information, the fifth sub-position information and the seventh sub-position information;
determining second position conversion information according to the second sub-position information, the fourth sub-position information, the sixth sub-position information and the eighth sub-position information;
and determining the position conversion information according to the first sub-position information, the second sub-position information, the first position conversion information and the second position conversion information.
10. The method of claim 9, wherein,
the first sub-position information is a first coordinate matrix of the first point relative to a first coordinate system,
the second sub-position information is a second coordinate matrix of the second point relative to the first coordinate system,
the third sub-position information is a third coordinate matrix of the first point relative to the second coordinate system,
the fourth sub-position information is a fourth coordinate matrix of the second point relative to the second coordinate system,
the fifth sub-position information is a first identification transformation matrix from the first identification to the origin of the first coordinate system,
the sixth sub-position information is a second identification transformation matrix from the second identification to the origin of the first coordinate system,
the seventh sub-position information is a fifth coordinate matrix of the first identifier relative to the second coordinate system,
the eighth sub-position information is a sixth coordinate matrix of the second identifier relative to the second coordinate system,
wherein the first coordinate system is a detection coordinate system of the vehicle and the second coordinate system is a detection coordinate system for creating the vehicle.
11. The method of claim 10, wherein a coordinate matrix of the form X_{F/A} = X(x_{F/A}, y_{F/A}, θ_{F/A}) is used to represent the first coordinate matrix X_{F_i} = X(x_{F_i}, y_{F_i}, θ_{F_i}), the second coordinate matrix X_{F_j} = X(x_{F_j}, y_{F_j}, θ_{F_j}), the third coordinate matrix X_{A_i} = X(x_{A_i}, y_{A_i}, θ_{A_i}), the fourth coordinate matrix X_{A_j} = X(x_{A_j}, y_{A_j}, θ_{A_j}), the fifth coordinate matrix X^A_{M_n} = X(x^A_{M_n}, y^A_{M_n}, θ^A_{M_n}), and the sixth coordinate matrix X^A_{M_p} = X(x^A_{M_p}, y^A_{M_p}, θ^A_{M_p}),

wherein x_{F/A} is used to represent x_{F_i}, x_{F_j}, x_{A_i}, x_{A_j}, x^A_{M_n}, or x^A_{M_p}; y_{F/A} is used to represent the corresponding y_{F_i}, y_{F_j}, y_{A_i}, y_{A_j}, y^A_{M_n}, or y^A_{M_p}; and θ_{F/A} is used to represent the corresponding θ_{F_i}, θ_{F_j}, θ_{A_i}, θ_{A_j}, θ^A_{M_n}, or θ^A_{M_p},

wherein x_{F_i} and y_{F_i} respectively represent the abscissa and ordinate of the first point i in a first x-y coordinate plane of a first coordinate system F, and θ_{F_i} represents the angle of the first point i with respect to the first x-y coordinate plane,

x_{F_j} and y_{F_j} respectively represent the abscissa and ordinate of the second point j in the first x-y coordinate plane, and θ_{F_j} represents the angle of the second point j with respect to the first x-y coordinate plane,

x_{A_i} and y_{A_i} respectively represent the abscissa and ordinate of the first point i in a second x-y coordinate plane of a second coordinate system A, and θ_{A_i} represents the angle of the first point i with respect to the second x-y coordinate plane,

x_{A_j} and y_{A_j} respectively represent the abscissa and ordinate of the second point j in the second x-y coordinate plane, and θ_{A_j} represents the angle of the second point j with respect to the second x-y coordinate plane,

x^A_{M_n} and y^A_{M_n} respectively represent the abscissa and ordinate of the first identifier M_n in the second x-y coordinate plane, and θ^A_{M_n} represents the angle of the first identifier M_n with respect to the second x-y coordinate plane, and

x^A_{M_p} and y^A_{M_p} respectively represent the abscissa and ordinate of the second identifier M_p in the second x-y coordinate plane, and θ^A_{M_p} represents the angle of the second identifier M_p with respect to the second x-y coordinate plane.
12. The method of claim 11, wherein the first identification conversion matrix is expressed, in the coordinate-matrix form recited in claim 11, from the coordinates (x^F_{M_n}, y^F_{M_n}, θ^F_{M_n}) of the first identifier M_n, and the second identification conversion matrix is expressed from the coordinates (x^F_{M_p}, y^F_{M_p}, θ^F_{M_p}) of the second identifier M_p,

wherein x^F_{M_n} and y^F_{M_n} respectively represent the abscissa and ordinate of the first identifier M_n in a first x-y coordinate plane of the first coordinate system F, and θ^F_{M_n} represents the angle of the first identifier M_n with respect to the first x-y coordinate plane, and

x^F_{M_p} and y^F_{M_p} respectively represent the abscissa and ordinate of the second identifier M_p in the first x-y coordinate plane, and θ^F_{M_p} represents the angle of the second identifier M_p with respect to the first x-y coordinate plane.
13. The method of claim 12, wherein the first position conversion information is determined by the following equation:

[equation: not legible in the text extraction]

and the second position conversion information is determined by the following equation:

[equation: not legible in the text extraction]
14. The method of claim 13, wherein the position conversion information is determined by the following equation and its auxiliary definitions:

[equation and auxiliary definitions: not legible in the text extraction]

wherein x_{F_k} and y_{F_k} respectively represent the abscissa and ordinate, in a first x-y coordinate plane of a first coordinate system F, of an arbitrary point k detected by the vehicle on the predetermined travel path.
15. The method according to claim 14, wherein the arbitrary point k is detected on the predetermined travel path of the vehicle, and an arbitrary-point coordinate matrix X_{F_k} = X(x_{F_k}, y_{F_k}, θ_{F_k}) of the arbitrary point k is obtained; and

a corresponding-point matrix X_{A_k} of the corresponding point of the arbitrary point k in the shared SLAM map is determined by the following equation:

[equation: not legible in the text extraction]

wherein θ_{F_k} represents the angle of the arbitrary point k with respect to the first x-y coordinate plane of the first coordinate system F, and

x_{A_k} and y_{A_k} respectively represent the abscissa and ordinate of the corresponding point in the shared SLAM map in a second x-y coordinate plane of a second coordinate system A, and θ_{A_k} represents the angle of the corresponding point of the arbitrary point k in the shared SLAM map with respect to the second x-y coordinate plane.
16. An apparatus for sharing a SLAM map for use with a vehicle, the apparatus comprising:
a first acquisition unit configured to be able to acquire first position information of at least two points on a predetermined travel path of the vehicle with respect to the vehicle and second position information with respect to a creating vehicle for creating a shared SLAM map;
a second acquisition unit configured to be able to acquire third position information of at least one logo on the vehicle with respect to the vehicle and fourth position information with respect to a creation vehicle;
a determination unit configured to be able to determine position conversion information for converting an arbitrary point detected by the vehicle on the predetermined travel path into a corresponding point in a shared SLAM map, from first position information, second position information, third position information, and fourth position information.
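Claim 16 names three functional units. The skeleton below is a hypothetical decomposition that mirrors that structure; all class and method names are illustrative, and the concrete acquisition and determination logic is left abstract because the patent does not spell it out in this record.

```python
class SharedSlamMapDevice:
    """Illustrative skeleton mirroring the three units of claim 16.
    Subclasses would supply the actual sensing and computation."""

    def acquire_path_point_positions(self):
        """First acquisition unit: return (first, second) position
        information for at least two points on the predetermined path."""
        raise NotImplementedError

    def acquire_logo_positions(self):
        """Second acquisition unit: return (third, fourth) position
        information for at least one logo on the vehicle."""
        raise NotImplementedError

    def determine_conversion(self, first, second, third, fourth):
        """Determination unit: derive the F-to-A position conversion
        information from the four position inputs."""
        raise NotImplementedError
```

A concrete implementation would override each method; the base class only fixes the interface boundaries the claim describes.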
17. A computer program product, wherein the computer program product comprises a computer program that, when executed by a processor, causes the processor to carry out the method of any one of claims 1 to 15.
CN202110759490.3A 2021-07-05 2021-07-05 Method and apparatus for using shared SLAM map for vehicle Pending CN115587151A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202110759490.3A CN115587151A (en) 2021-07-05 2021-07-05 Method and apparatus for using shared SLAM map for vehicle
PCT/CN2022/094885 WO2023279878A1 (en) 2021-07-05 2022-05-25 Shared slam map using method and device for vehicle

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110759490.3A CN115587151A (en) 2021-07-05 2021-07-05 Method and apparatus for using shared SLAM map for vehicle

Publications (1)

Publication Number Publication Date
CN115587151A true CN115587151A (en) 2023-01-10

Family

ID=84771757

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110759490.3A Pending CN115587151A (en) 2021-07-05 2021-07-05 Method and apparatus for using shared SLAM map for vehicle

Country Status (2)

Country Link
CN (1) CN115587151A (en)
WO (1) WO2023279878A1 (en)


Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103968822B (en) * 2013-01-24 2018-04-13 腾讯科技(深圳)有限公司 Air navigation aid, equipment and navigation system for navigation
CN109641538A (en) * 2016-07-21 2019-04-16 国际智能技术公司 It is created using vehicle, updates the system and method for map
JP2018136240A (en) * 2017-02-23 2018-08-30 三菱電機株式会社 Estimation device, method for estimation, tracking device having estimation device, and method for tracking having method for estimation
CN109857111B (en) * 2019-02-18 2020-11-13 广州小鹏汽车科技有限公司 High-precision positioning method and system based on shared SLAM map

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117906615A (en) * 2024-03-15 2024-04-19 苏州艾吉威机器人有限公司 Fusion positioning method and system of intelligent carrying equipment based on environment identification code
CN117906615B (en) * 2024-03-15 2024-06-04 苏州艾吉威机器人有限公司 Fusion positioning method and system of intelligent carrying equipment based on environment identification code

Also Published As

Publication number Publication date
WO2023279878A1 (en) 2023-01-12

Similar Documents

Publication Publication Date Title
CN104977012A (en) Method and system for determining position of vehicle
US11475591B2 (en) Hybrid metric-topological camera-based localization
CN107218927B (en) A kind of cargo pallet detection system and method based on TOF camera
CN108363386A (en) Position Method for Indoor Robot, apparatus and system based on Quick Response Code and laser
US11010919B2 (en) Object locator with fiducial marker
CN106650873A (en) Identification code, and automatic guiding vehicle rapid navigation method and system
CN103782247A (en) Method and apparatus for using pre-positioned objects to localize an industrial vehicle
CN109459032B (en) Mobile robot positioning method, navigation method and grid map establishing method
CN110789529B (en) Vehicle control method, device and computer-readable storage medium
WO2022000197A1 (en) Flight operation method, unmanned aerial vehicle, and storage medium
CN110083668B (en) Data management system, management method, terminal and storage medium for high-precision map
CN108534789B (en) Multipath positioning coordinate unifying method, electronic equipment and readable storage medium
CN105243366A (en) Two-dimensional code based vehicle positioning method
CN111681172A (en) Method, equipment and system for cooperatively constructing point cloud map
CN108876857A (en) Localization method, system, equipment and the storage medium of automatic driving vehicle
CN113447936A (en) AGV intelligent forklift and method and device for detecting platform state of ground pile inventory area
US20200124695A1 (en) Positioning method, apparatus and system, layout method of positioning system, and storage medium
CN115587151A (en) Method and apparatus for using shared SLAM map for vehicle
CN113379011A (en) Pose correction method, device, equipment and storage medium
Tas et al. High-definition map update framework for intelligent autonomous transfer vehicles
US11417015B2 (en) Decentralized location determination systems and methods
CN108981705B (en) Positioning reference device
Thompson et al. Vision-based navigation
CN112069849A (en) Identification and positioning method, device, equipment and storage medium based on multiple two-dimensional codes
US20210405197A1 (en) GLOBAL LOCALIZATION APPARATUS AND METHOD IN DYNAMIC ENVIRONMENTS USING 3D LiDAR SCANNER

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination