CN108875689B - Unmanned vehicle alignment method, system, equipment and storage medium - Google Patents

Info

Publication number: CN108875689B
Application number: CN201810708615.8A
Authority: CN (China)
Prior art keywords: two-dimensional code, unmanned vehicle, code pattern, image, current
Legal status: Active (granted)
Other versions: CN108875689A (application publication)
Original language: Chinese (zh)
Inventors: 张波, 吕灏, 刘吉川
Current assignee: Shanghai Xijing Technology Co., Ltd.
Original assignee: Shanghai Westwell Information Technology Co., Ltd. (applicant)


Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00: Scenes; Scene-specific elements
    • G06V20/50: Context or environment of the image
    • G06V20/56: Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • G06V20/58: Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
    • G06V20/586: Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads of parking space
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00: Arrangements for image or video recognition or understanding
    • G06V10/20: Image preprocessing
    • G06V10/24: Aligning, centring, orientation detection or correction of the image
    • G06V10/245: Aligning, centring, orientation detection or correction of the image by locating a pattern; Special marks for positioning

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
  • Traffic Control Systems (AREA)

Abstract

The invention provides an unmanned vehicle alignment method, system, equipment and storage medium. The method comprises the following steps: capturing an image of a square two-dimensional code pattern ahead with a camera; establishing a coordinate system from the image, building a first matrix A from the coordinates of the four corners of the pattern, setting a second matrix B for the top-view state, and obtaining a transformation matrix M; establishing a mapping table between each two-dimensional code pattern on the travel route and its positioning coordinate information; capturing a current image in the driving direction and stretch-deforming the captured two-dimensional code pattern back into a square; if the current image contains a preset two-dimensional code pattern, establishing a coordinate system from the current image, building a current matrix C from the four corners of the pattern, obtaining a top-view matrix D = C·M, and calculating the distance between the unmanned vehicle and the two-dimensional code pattern; and calculating the current coordinates of the unmanned vehicle so as to control the vehicle. The invention can drive accurately to the target coordinates in an automated manner, thereby improving overall operating efficiency.

Description

Unmanned vehicle alignment method, system, equipment and storage medium
Technical Field
The invention relates to the field of unmanned vehicle control, in particular to an unmanned vehicle alignment method, system, equipment and storage medium.
Background
With economic development and growing shipping demand, traditional container terminals increasingly face large manpower shortages, low operating efficiency and similar problems. The rise of unmanned-driving technology promises to help ports solve these pain points with automated solutions and further improve terminal operating efficiency. A conventional terminal completes the scheduling and transport of containers with bridge cranes, gantry cranes and other equipment, and transfers the containers by container truck. When a port is retrofitted for automation, the problem of interaction between the trucks and the crane equipment is inevitably encountered. At present, equipment such as bridge cranes and gantry cranes grips containers with mechanical grippers that operate with only one degree of freedom, so the truck must be aligned precisely for the gripper to pass smoothly through the lock holes on the container and complete the grab. The traditional manual approach requires the truck driver to adjust the position repeatedly to achieve alignment, which takes a long time and hurts operating efficiency.
Accordingly, the present invention provides a method, system, device and storage medium for unmanned vehicle alignment.
Disclosure of Invention
Aiming at the problems in the prior art, the invention aims to provide a method, a system, equipment and a storage medium for aligning an unmanned vehicle, which can accurately drive to a target coordinate in an automatic mode and realize high-precision alignment of a truck, a bridge crane and a gantry crane in an automatic port, thereby saving time and improving the overall operation efficiency.
The embodiment of the invention provides an unmanned vehicle alignment method, which comprises the following steps:
S101, capturing, with a fixed camera of the unmanned vehicle, an image of a square two-dimensional code pattern with side length L cm on the ground ahead in the driving direction; establishing a coordinate system from the image, wherein the origin of the coordinate system is one corner of the rectangular image, and the coordinates of the four corners of the square two-dimensional code pattern, deformed into a trapezoid by perspective, are (X1, Y1), (X2, Y2), (X3, Y3) and (X4, Y4) respectively;
Setting a first matrix whose rows are the four measured corner coordinates:

    A = | X1  Y1 |
        | X2  Y2 |
        | X3  Y3 |
        | X4  Y4 |

and a second matrix B whose rows are the corresponding four corner coordinates in the top-view state;

obtaining a transformation matrix M from the first matrix A and the second matrix B, M = (A^T A)^-1 A^T B;
S102, arranging a plurality of square two-dimensional code patterns on the travel route, and establishing a mapping table between each two-dimensional code pattern and its positioning coordinate information;

S103, capturing a current image in the driving direction with the camera of the unmanned vehicle, detecting the two-dimensional code pattern in the current image, and stretch-deforming the two-dimensional code pattern in the image into a square two-dimensional code pattern;

S104, judging whether the two-dimensional code pattern in the current image hits an entry of the mapping table; if so, executing step S105, and if not, returning to step S103;

S105, establishing a coordinate system from the current image, wherein the coordinates of the four corners of the square two-dimensional code pattern, deformed into a trapezoid by perspective in the current image, are (X5, Y5), (X6, Y6), (X7, Y7) and (X8, Y8) respectively;
Setting a current matrix whose rows are the four corner coordinates:

    C = | X5  Y5 |
        | X6  Y6 |
        | X7  Y7 |
        | X8  Y8 |

obtaining a top-view matrix D = C·M, whose rows (X5', Y5') to (X8', Y8') are the corner coordinates in the top-view state;

the distance between the unmanned vehicle and the two-dimensional code pattern in the current image is then S cm, the distance from the origin to the center of the pattern in the top-view state:

    S = sqrt( ((X5' + X6' + X7' + X8') / 4)^2 + ((Y5' + Y6' + Y7' + Y8') / 4)^2 )
S106, obtaining the current coordinates of the unmanned vehicle from the distance between the unmanned vehicle and the two-dimensional code pattern in the current image obtained in step S105, together with the positioning coordinate information of the square two-dimensional code pattern in the current image;

S107, judging whether the unmanned vehicle has reached the target coordinates; if so, executing step S109, and if not, executing step S108;

S108, comparing the current coordinates of the unmanned vehicle with the target coordinates to apply throttle or brake; and

S109, ending.
Preferably, step S108 comprises:

S1081, collecting the current vehicle speed V_t of the unmanned vehicle;

S1082, calculating the target acceleration a_n;

S1083, calculating the current actual acceleration of the vehicle by the difference method, where V_t is the current vehicle speed, V_{t-1} is the vehicle speed at the previous sample, and Δt is the time interval between the two samples:

    a_r = (V_t - V_{t-1}) / Δt

S1084, calculating the difference e(t) between the current target acceleration a_n and the actual acceleration a_r; and

S1085, calculating the currently required throttle and brake values:

    u(t) = k_p * e(t) + k_i * ∫ e(t) dt + k_d * de(t)/dt

The values of k_p, k_i and k_d each lie in the range (0, 1); when u(t) is greater than zero, u(t) is the throttle value, and when u(t) is less than zero, u(t) is the brake value.
Preferably, in step S1085, k_p = 0.6, k_i = 0.1 and k_d = 0.3.
Preferably, in step S101, assuming that each pixel of the image corresponds to 1 cm, the coordinates of the four corners of the square two-dimensional code pattern in the top-view state are (-0.5L, L), (0.5L, L), (0.5L, 2L) and (-0.5L, 2L) respectively;

the second matrix corresponding to the top-view state is then established as:

    B = | -0.5L   L  |
        |  0.5L   L  |
        |  0.5L  2L  |
        | -0.5L  2L  |
Preferably, in step S105, the current matrix C is converted into the top-view matrix D in the top-view state by multiplying it by the transformation matrix M.
Preferably, the origin of the coordinate system in step S101 is the lower-left corner of the rectangular image, with the X axis extending rightward from the origin and the Y axis extending upward from the origin;

the origin of the coordinate system in step S105 is likewise the lower-left corner of the current image, with the X axis extending rightward from the origin and the Y axis extending upward from the origin.
Preferably, all the square two-dimensional code patterns in step S101 have the same size.
Preferably, the order of step S101 and step S102 is exchanged.
Preferably, the resolution of the camera is 1024 × 768.
The embodiment of the present invention further provides an alignment system for an unmanned vehicle, which is used for implementing the above alignment method for an unmanned vehicle, and the alignment system for an unmanned vehicle includes:
the transformation matrix obtaining module is used for capturing, with a fixed camera of the unmanned vehicle, an image of a square two-dimensional code pattern with side length L cm on the ground ahead in the driving direction; establishing a coordinate system from the image, wherein the origin of the coordinate system is one corner of the rectangular image, and the coordinates of the four corners of the square two-dimensional code pattern, deformed into a trapezoid by perspective, are (X1, Y1), (X2, Y2), (X3, Y3) and (X4, Y4) respectively;

setting a first matrix whose rows are the four measured corner coordinates:

    A = | X1  Y1 |
        | X2  Y2 |
        | X3  Y3 |
        | X4  Y4 |

and a second matrix B whose rows are the corresponding four corner coordinates in the top-view state;

and obtaining a transformation matrix M from the first matrix A and the second matrix B, M = (A^T A)^-1 A^T B.
The mapping relation establishing module is used for arranging a plurality of square two-dimensional code patterns on the travel route and establishing a mapping table between each two-dimensional code pattern and its positioning coordinate information;

the image stretching deformation module is used for capturing a current image in the driving direction with the camera of the unmanned vehicle, detecting the two-dimensional code pattern in the current image, and stretch-deforming the two-dimensional code pattern in the image into a square two-dimensional code pattern;

the first judgment module is used for judging whether the two-dimensional code pattern in the current image hits an entry of the mapping table; if so, the distance obtaining module is executed, and if not, control returns to the image stretching deformation module;
the distance obtaining module is used for establishing a coordinate system from the current image, wherein the coordinates of the four corners of the square two-dimensional code pattern, deformed into a trapezoid by perspective in the current image, are (X5, Y5), (X6, Y6), (X7, Y7) and (X8, Y8) respectively;

setting a current matrix whose rows are the four corner coordinates:

    C = | X5  Y5 |
        | X6  Y6 |
        | X7  Y7 |
        | X8  Y8 |

obtaining a top-view matrix D = C·M, whose rows (X5', Y5') to (X8', Y8') are the corner coordinates in the top-view state;

and obtaining the distance S cm between the unmanned vehicle and the two-dimensional code pattern in the current image:

    S = sqrt( ((X5' + X6' + X7' + X8') / 4)^2 + ((Y5' + Y6' + Y7' + Y8') / 4)^2 )
the coordinate positioning module is used for obtaining the current coordinates of the unmanned vehicle from the distance between the unmanned vehicle and the two-dimensional code pattern in the current image obtained by the distance obtaining module, together with the positioning coordinate information of the square two-dimensional code pattern in the current image;

the second judgment module is used for judging whether the unmanned vehicle has reached the target coordinates, and if not, the driving output module is executed; and

the driving output module is used for comparing the current coordinates of the unmanned vehicle with the target coordinates to apply throttle or brake.
An embodiment of the present invention also provides an unmanned vehicle alignment apparatus, including:
a processor;
a memory having stored therein executable instructions of the processor;
wherein the processor is configured to perform the steps of the above-described unmanned vehicle alignment method via execution of the executable instructions.
Embodiments of the present invention also provide a computer-readable storage medium for storing a program, which when executed, implements the steps of the above-mentioned unmanned vehicle alignment method.
The invention aims to provide a method, a system, equipment and a storage medium for aligning an unmanned vehicle, which can accurately drive to a target coordinate in an automatic mode and realize high-precision alignment of a truck, a bridge crane and a gantry crane at an automatic port, thereby saving time and improving the overall operation efficiency.
Drawings
Other features, objects and advantages of the present invention will become more apparent upon reading of the following detailed description of non-limiting embodiments thereof, with reference to the accompanying drawings.
FIG. 1 is a flow chart of the unmanned vehicle alignment method of the present invention;
FIGS. 2 through 7 are schematic diagrams of an unmanned vehicle alignment method implementation of the present invention;
FIG. 8 is a block schematic diagram of the unmanned vehicle alignment system of the present invention;
FIG. 9 is a schematic structural view of the unmanned vehicle alignment apparatus of the present invention; and
fig. 10 is a schematic structural diagram of a computer-readable storage medium according to an embodiment of the present invention.
Detailed Description
Example embodiments will now be described more fully with reference to the accompanying drawings. Example embodiments may, however, be embodied in many different forms and should not be construed as limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the concept of example embodiments to those skilled in the art. The same reference numerals in the drawings denote the same or similar structures, and thus their repetitive description will be omitted.
FIG. 1 is a flow chart of the unmanned vehicle alignment method of the present invention. As shown in fig. 1, an embodiment of the present invention provides an unmanned vehicle alignment method, including the steps of:

S101, an image of a square two-dimensional code pattern with side length L cm on the ground ahead in the driving direction is captured with a fixed camera of the unmanned vehicle. In this embodiment, the resolution of the camera is 1024 × 768, but it is not limited thereto. A coordinate system is established from the image: the origin of the coordinate system is one corner of the rectangular image, namely the lower-left corner, with the X axis extending rightward from the origin and the Y axis extending upward. The coordinates of the four corners of the square two-dimensional code pattern, deformed into a trapezoid by perspective, are (X1, Y1), (X2, Y2), (X3, Y3) and (X4, Y4) respectively.

Setting a first matrix whose rows are the four measured corner coordinates:

    A = | X1  Y1 |
        | X2  Y2 |
        | X3  Y3 |
        | X4  Y4 |

Assuming that each pixel of the image corresponds to 1 cm, the coordinates of the four corners of the square two-dimensional code pattern in the top-view state are (-0.5L, L), (0.5L, L), (0.5L, 2L) and (-0.5L, 2L) respectively; the second matrix corresponding to the top-view state is then established as:

    B = | -0.5L   L  |
        |  0.5L   L  |
        |  0.5L  2L  |
        | -0.5L  2L  |

The transformation matrix M is obtained from the first matrix A and the second matrix B, M = (A^T A)^-1 A^T B.
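As a minimal sketch of the least-squares fit in step S101, the computation can be written as follows. The homogeneous column of ones, which lets the fitted map carry an offset, is an assumption for illustration, since the original renders the matrices only as formula images:

```python
import numpy as np

def fit_transform(corners_img, corners_top):
    """Fit the transformation matrix M with M = (A^T A)^-1 A^T B (step S101).

    corners_img: the four corner coordinates (Xi, Yi) of the trapezoidal
                 pattern measured in the camera image.
    corners_top: the corresponding top-view corners, e.g. for side length L:
                 (-0.5L, L), (0.5L, L), (0.5L, 2L), (-0.5L, 2L).
    A column of ones is appended to each row so the least-squares fit is
    affine (an illustrative assumption, not stated in the original).
    """
    A = np.hstack([np.asarray(corners_img, float), np.ones((4, 1))])
    B = np.asarray(corners_top, float)
    return np.linalg.inv(A.T @ A) @ A.T @ B      # M = (A^T A)^-1 A^T B

def apply_transform(M, corners):
    """Step S105: map current-image corners to the top-view state, D = C M."""
    C = np.hstack([np.asarray(corners, float), np.ones((4, 1))])
    return C @ M
```

Note that an affine fit only approximates a true perspective warp; an exact four-point perspective mapping would instead solve for a 3 × 3 homography.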
S102, a plurality of square two-dimensional code patterns are arranged on the travel route, and all the square two-dimensional code patterns have the same size. A mapping table between each two-dimensional code pattern and its positioning coordinate information is established. In a preferred embodiment, the order of step S101 and step S102 is exchanged, but it is not limited thereto.
S103, shooting a current image of the driving direction by using a camera of the unmanned vehicle, collecting a two-dimensional code pattern in the current image, and stretching and deforming the two-dimensional code pattern in the image into a square two-dimensional code pattern.
And S104, judging whether the two-dimensional code pattern in the current image hits one of the mapping relation tables, if so, executing the step S105, and if not, returning to the step S103.
S105, a coordinate system is established from the current image: the origin of the coordinate system is the lower-left corner of the current image, with the X axis extending rightward from the origin and the Y axis extending upward. The coordinates of the four corners of the square two-dimensional code pattern, deformed into a trapezoid by perspective in the current image, are (X5, Y5), (X6, Y6), (X7, Y7) and (X8, Y8) respectively.
Setting a current matrix whose rows are the four corner coordinates:

    C = | X5  Y5 |
        | X6  Y6 |
        | X7  Y7 |
        | X8  Y8 |

The current matrix C is converted into the top-view matrix D by multiplying it by the transformation matrix M:

    D = C·M, with rows (X5', Y5'), (X6', Y6'), (X7', Y7') and (X8', Y8')

The distance between the unmanned vehicle and the two-dimensional code pattern in the current image is then S cm, the distance from the origin to the center of the pattern in the top-view state:

    S = sqrt( ((X5' + X6' + X7' + X8') / 4)^2 + ((Y5' + Y6' + Y7' + Y8') / 4)^2 )
S106, the current coordinates of the unmanned vehicle are obtained from the distance between the unmanned vehicle and the two-dimensional code pattern in the current image obtained in step S105, together with the positioning coordinate information of the square two-dimensional code pattern in the current image.

S107, it is judged whether the unmanned vehicle has reached the target coordinates; if so, step S109 is executed, and if not, step S108 is executed.

S108, the current coordinates of the unmanned vehicle are compared with the target coordinates to apply throttle or brake. In this embodiment, step S108 includes the following steps:
S1081, collecting the current vehicle speed V_t of the unmanned vehicle;

S1082, calculating the target acceleration a_n;

S1083, calculating the current actual acceleration of the vehicle by the difference method, where V_t is the current vehicle speed, V_{t-1} is the vehicle speed at the previous sample, and Δt is the time interval between the two samples:

    a_r = (V_t - V_{t-1}) / Δt

S1084, calculating the difference e(t) between the current target acceleration a_n and the actual acceleration a_r.

S1085, calculating the currently required throttle and brake values:

    u(t) = k_p * e(t) + k_i * ∫ e(t) dt + k_d * de(t)/dt

The values of k_p, k_i and k_d each lie in the range (0, 1); when u(t) is greater than zero, u(t) is the throttle value, and when u(t) is less than zero, u(t) is the brake value. In a preferred embodiment, in step S1085, k_p = 0.6, k_i = 0.1 and k_d = 0.3, but it is not limited thereto.
And S109, ending.
The invention aims to provide an alignment method of an unmanned vehicle, which can accurately drive to a target coordinate in an automatic mode and realize high-precision alignment of a truck, a bridge crane and a gantry crane at an automatic port, thereby saving time and improving the overall operation efficiency.
Fig. 2 to 7 are schematic diagrams of the unmanned vehicle alignment method according to the present invention. The implementation of the present invention is specifically described by fig. 2 to 7.
As shown in fig. 2, square two-dimensional code patterns 2 having a side length of 200cm are provided on a travel route, and all the square two-dimensional code patterns 2 have the same size. And establishing a mapping relation table of each two-dimension code pattern and the positioning coordinate information of the two-dimension code pattern. The front part of the unmanned vehicle 1 is provided with a fixed camera 11, and in order to ensure accurate determination of image deformation, the camera 11 is preferably non-rotatable and fixed in focus, and intelligently shoots an image in a preset range in front of the unmanned vehicle 1.
As shown in fig. 3, the square two-dimensional code pattern 2 is spaced 200cm from the camera 11. Although the two-dimensional code pattern is actually square, the two-dimensional code pattern 2 in the image is deformed in the image shot by the camera 11, and becomes trapezoidal, one long side of the trapezoid is formed close to the camera 11, one short side of the trapezoid is formed far away from the camera 11, and the two sides of the trapezoid are respectively inclined to form two inclined sides of the trapezoid.
As shown in fig. 4, the resolution of the camera 11 is 1024 × 768. A coordinate system is created from the image: the lower-left corner of the image 31 with resolution 1024 × 768 is set as the origin, with the X axis extending rightward from the origin and the Y axis extending upward. Assuming that each pixel of the image corresponds to 1 cm, the measured coordinates (X1, Y1), (X2, Y2), (X3, Y3) and (X4, Y4) of the four corners of the square two-dimensional code pattern 32, deformed into a trapezoid by perspective, are (446, 240), (982, 235), (857, 511) and (610, 513) respectively.

Setting a first matrix:

    A = | 446  240 |
        | 982  235 |
        | 857  511 |
        | 610  513 |
As shown in fig. 5, the coordinates of the four corners of the square two-dimensional code pattern 34 in the image 33 in the top-view state are (-100, 200), (100, 200), (100, 400) and (-100, 400) respectively. The side length of the square two-dimensional code pattern 34 is 200 pixels and the side length of the square two-dimensional code pattern 2 is 200 cm, so the 200-pixel distance from the square two-dimensional code pattern 34 to the origin O is equal in value to the 200 cm distance between the unmanned vehicle 1 and the two-dimensional code pattern.

Then the second matrix corresponding to the top-view state is established:

    B = | -100  200 |
        |  100  200 |
        |  100  400 |
        | -100  400 |

A transformation matrix M is then obtained from the first matrix A and the second matrix B, M = (A^T A)^-1 A^T B.

The obtained transformation matrix M can be used to stretch-deform the square two-dimensional code pattern 32, which appears trapezoidal in the image 31 due to perspective deformation, into the square two-dimensional code pattern 34 in the image 33.
As shown in fig. 6, the unmanned vehicle travels on a travel route provided with the square two-dimensional code pattern 2. The camera 11 of the unmanned vehicle 1 is used to capture a current image of the driving direction, the two-dimensional code pattern 42 in the current image 41 is captured, and the two-dimensional code pattern 42 in the image 41 is stretched and deformed into a square two-dimensional code pattern 44.
It is judged whether the two-dimensional code pattern 44 in the current image 41 hits an entry of the mapping table; if so, a coordinate system is established from the current image, with the origin at the lower-left corner of the current image, the X axis extending rightward from the origin, and the Y axis extending upward. The coordinates of the four corners of the square two-dimensional code pattern 2, deformed into a trapezoid by perspective in the current image, are (X5, Y5), (X6, Y6), (X7, Y7) and (X8, Y8) respectively.
Setting a current matrix:

    C = | X5  Y5 |
        | X6  Y6 |
        | X7  Y7 |
        | X8  Y8 |

The current matrix C is converted into the top-view matrix D in the top-view state by multiplying it by the transformation matrix M:

    D = C·M
As shown in fig. 7, the coordinates of the four corners of the two-dimensional code pattern 44 in the image 41 can thereby be obtained as (X9, Y9), (X10, Y10), (X11, Y11) and (X12, Y12). The coordinates of the center E of the two-dimensional code pattern 44 are:

    E = ( (X9 + X10 + X11 + X12) / 4 , (Y9 + Y10 + Y11 + Y12) / 4 )

The distance OE from the center E of the two-dimensional code pattern 44 in the image 41 to the origin O is equal in value to the distance S cm between the unmanned vehicle 1 and the two-dimensional code pattern in the current image:

    S = sqrt( ((X9 + X10 + X11 + X12) / 4)^2 + ((Y9 + Y10 + Y11 + Y12) / 4)^2 )
The detailed calculation process is not repeated here.
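The computation of fig. 7 can be sketched as follows, assuming, as the text implies, that the center E is the mean of the four top-view corner coordinates and S is the distance OE:

```python
import math

def pattern_distance(corners):
    """Distance S from the origin O (at the vehicle) to the center E of the
    top-view two-dimensional code pattern (fig. 7).  `corners` holds the four
    top-view corner coordinates (X9, Y9) .. (X12, Y12)."""
    xe = sum(x for x, _ in corners) / 4.0   # E = mean of the four corners
    ye = sum(y for _, y in corners) / 4.0
    return math.hypot(xe, ye)               # S = |OE|
```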
The current coordinates of the unmanned vehicle 1 are obtained from the distance S cm and the positioning coordinate information of the square two-dimensional code pattern 2 in the current image. For example: the unmanned vehicle 1 travels on the travel route, and the positioning coordinate information of the square two-dimensional code pattern 2 on that route is (P, Q); the unmanned vehicle 1 is therefore located on the travel route at a distance of S cm from the positioning coordinates (P, Q).
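As an illustration of the S102 mapping table and this positioning step, a minimal sketch follows; the payload strings, the coordinates, and the convention that the route runs along the Q axis are all assumptions for illustration:

```python
# Hypothetical S102 mapping table: decoded two-dimensional code payload ->
# positioning coordinates (P, Q) of that pattern on the travel route.
MAPPING_TABLE = {
    "QR-0001": (0.0, 1000.0),
    "QR-0002": (0.0, 3000.0),
}

def locate_vehicle(payload, distance_s):
    """S104/S106 sketch: if the decoded pattern hits the table, the vehicle
    lies on the travel route at distance S (cm) short of the pattern's
    positioning coordinates (P, Q); the route is assumed to run along Q."""
    if payload not in MAPPING_TABLE:     # S104 miss: keep scanning (S103)
        return None
    p, q = MAPPING_TABLE[payload]
    return (p, q - distance_s)
```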
Once accurate positioning information is available, the vehicle can be controlled precisely. It is judged whether the unmanned vehicle 1 has reached the target coordinates; if it has not, the current coordinates of the unmanned vehicle 1 are compared with the target coordinates to apply throttle or brake, which may comprise the following steps:
Acquiring the current vehicle speed V_t of the unmanned vehicle 1; calculating the target acceleration a_n.

Calculating the current actual acceleration of the vehicle by the difference method, where V_t is the current vehicle speed, V_{t-1} is the vehicle speed at the previous sample, and Δt is the time interval between the two samples:

    a_r = (V_t - V_{t-1}) / Δt

Calculating the difference e(t) between the current target acceleration a_n and the actual acceleration a_r, and calculating the currently required throttle and brake values:

    u(t) = k_p * e(t) + k_i * ∫ e(t) dt + k_d * de(t)/dt

The values of k_p, k_i and k_d each lie in the range (0, 1); when u(t) is greater than zero, u(t) is the throttle value, and when u(t) is less than zero, u(t) is the brake value. In a preferred embodiment, k_p = 0.6, k_i = 0.1 and k_d = 0.3, but it is not limited thereto.
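A minimal sketch of this throttle/brake law, assuming the standard discrete PID form implied by the gains k_p, k_i and k_d (the original shows u(t) only as a formula image):

```python
class ThrottleBrakePID:
    """Discrete PID on the error e = a_n - a_r, with the preferred gains
    k_p = 0.6, k_i = 0.1, k_d = 0.3.  A positive output is read as a
    throttle value, a negative output as a brake value."""

    def __init__(self, kp=0.6, ki=0.1, kd=0.3):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0      # running approximation of the integral term
        self.prev_e = None

    def update(self, a_target, a_actual, dt):
        e = a_target - a_actual
        self.integral += e * dt
        deriv = 0.0 if self.prev_e is None else (e - self.prev_e) / dt
        self.prev_e = e
        return self.kp * e + self.ki * self.integral + self.kd * deriv
```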
In a preferred embodiment, after the above procedure the automated container truck can travel to the vicinity of the alignment point, but a certain error remains, and with some probability the high-accuracy requirement of alignment cannot be met; this is caused by error in the vehicle's own running gear. At this point the container truck automatically enters a fine-alignment mode, in which it automatically judges whether a forward or a reverse movement is needed. The controller then applies a constant throttle value G to bring the wheels from a stationary state into rolling; when the speed of the truck exceeds a small threshold V_L, the controller applies a constant brake value B, bringing the truck to a quick stop. The principle is to exploit the short moment in which the container truck passes from a stationary to a moving state to achieve low-speed control of the vehicle, and in this way the final high-precision control can be completed. The constant throttle value and constant brake value can be obtained from actual tests. This process is repeated until the truck is able to stop within the specified distance. The invention thus provides an alignment method for the unmanned vehicle 1 that can drive accurately to the target coordinates in an automated manner and achieve high-precision alignment of trucks, bridge cranes and gantry cranes in an automated port, thereby saving time and improving overall operating efficiency.
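The fine-alignment loop above can be sketched as follows; the callback interfaces, the tolerance, and the nudge abstraction are illustrative assumptions, with the constant throttle value G, brake value B and threshold V_L (obtained from actual tests) hidden inside the `nudge` callback:

```python
def fine_align(distance_error, nudge, tolerance=2.0, max_nudges=100):
    """Fine-alignment mode: repeat short start-stop creep cycles until the
    truck stops within the specified distance.

    distance_error(): signed remaining distance (cm) to the alignment point.
    nudge(direction): one creep cycle in the given direction -- hold the
        constant throttle G until the speed exceeds the small threshold V_L,
        then apply the constant brake B so the truck stops quickly.
    """
    for _ in range(max_nudges):
        err = distance_error()
        if abs(err) <= tolerance:
            return True                      # stopped within the spec
        nudge(1.0 if err > 0 else -1.0)      # forward or reverse movement
    return False
```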
Fig. 8 is a block schematic diagram of the unmanned vehicle alignment system of the present invention. As shown in fig. 8, an embodiment of the present invention further provides an unmanned vehicle alignment system, which is used for implementing the above unmanned vehicle alignment method, and the unmanned vehicle alignment system 9 includes:
the transformation matrix obtaining module 91 is used for capturing, with a fixed camera of the unmanned vehicle, an image of a square two-dimensional code pattern with side length L cm on the ground ahead in the driving direction; establishing a coordinate system from the image, wherein the origin of the coordinate system is one corner of the rectangular image, and the coordinates of the four corners of the square two-dimensional code pattern, deformed into a trapezoid by perspective, are (X1, Y1), (X2, Y2), (X3, Y3) and (X4, Y4) respectively.

Setting a first matrix whose rows are the four measured corner coordinates:

    A = | X1  Y1 |
        | X2  Y2 |
        | X3  Y3 |
        | X4  Y4 |

and a second matrix B whose rows are the corresponding four corner coordinates in the top-view state.

The transformation matrix M is obtained from the first matrix A and the second matrix B, M = (A^T A)^-1 A^T B.
The mapping relationship establishing module 92 is configured to set a plurality of square two-dimensional code patterns on the driving line, and establish a mapping relationship table between each two-dimensional code pattern and the positioning coordinate information of the two-dimensional code pattern.
The image stretching deformation module 93 shoots a current image in the driving direction by using a camera of the unmanned vehicle, collects a two-dimensional code pattern in the current image, and stretches and deforms the two-dimensional code pattern in the image into a square two-dimensional code pattern.
The first judgment module 94 judges whether the two-dimensional code pattern in the current image hits an entry of the mapping relation table; if so, the distance obtaining module is executed; if not, the process returns to the image stretching deformation module.
The distance obtaining module 95 establishes a coordinate system according to the current image; the coordinates of the four corners of the square two-dimensional code pattern deformed into a trapezoid by perspective in the current image are (X5, Y5), (X6, Y6), (X7, Y7) and (X8, Y8), respectively.
Setting a current matrix

C = | X5  Y5  1 |
    | X6  Y6  1 |
    | X7  Y7  1 |
    | X8  Y8  1 |

a look-down matrix is obtained as

D = C·M = | X5′  Y5′ |
          | X6′  Y6′ |
          | X7′  Y7′ |
          | X8′  Y8′ |

and the distance between the unmanned vehicle and the two-dimensional code pattern in the current image is S cm, with

S = (Y5′ + Y6′ + Y7′ + Y8′) / 4.
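Given a calibrated 3×2 transform M, the look-down projection and the distance estimate can be sketched as follows. Taking S as the mean of the transformed Y coordinates is an illustrative assumption (the patent states its exact distance expression only as a formula image), consistent with a top view in which one pixel corresponds to 1 cm:

```python
import numpy as np

def distance_to_code(corners_img, M):
    """Project detected corner pixels through the calibrated 3x2 transform M
    and estimate the forward distance S (cm) to the two-dimensional code.

    The mean of the transformed Y coordinates is used here as an
    illustrative distance measure (1 px = 1 cm in the top view).
    """
    C = np.array([[x, y, 1.0] for x, y in corners_img])  # current matrix
    D = C @ M                                            # look-down matrix
    return float(D[:, 1].mean())
```

With the code's mapped position known from the mapping relation table, subtracting S along the driving line then yields the vehicle's current coordinate.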
and the coordinate positioning module 96 is used for obtaining the current coordinate of the unmanned vehicle according to the distance between the unmanned vehicle and the two-dimensional code pattern in the current image obtained by the distance obtaining module and the positioning coordinate information of the square two-dimensional code pattern in the current image.
The second judgment module 97 judges whether the unmanned vehicle reaches the target coordinates; if not, the driving output module is executed.
The driving output module 98 is used for comparing the current coordinates and the target coordinates of the unmanned vehicle and applying throttle or brake accordingly.
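The throttle/brake decision of the driving output module is detailed in the claims as a PID controller on the difference between the target and actual accelerations, with a positive output read as a throttle value and a negative output as a brake value. A minimal discrete-time sketch, in which the discretisation of the integral and derivative terms is an assumption:

```python
class ThrottleBrakePID:
    """Discrete PID on the acceleration error; u > 0 -> throttle, u < 0 -> brake."""

    def __init__(self, kp=0.6, ki=0.1, kd=0.3):
        # Gains in (0, 1); the defaults are the values given in the claims.
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_err = None

    def step(self, target_accel, actual_accel, dt):
        """One control step over a sampling interval dt (seconds)."""
        err = target_accel - actual_accel          # e(t) = a_n - a_r
        self.integral += err * dt                  # rectangular integration
        deriv = 0.0 if self.prev_err is None else (err - self.prev_err) / dt
        self.prev_err = err
        u = self.kp * err + self.ki * self.integral + self.kd * deriv
        throttle = max(u, 0.0)                     # positive output: throttle
        brake = max(-u, 0.0)                       # negative output: brake
        return throttle, brake
```

The actual acceleration fed in as a_r would come from the difference method on successive speed samples, as described in the claims.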
The unmanned vehicle alignment system can accurately drive to the target coordinates in an automated manner and realizes high-precision alignment of the truck with the bridge crane and the gantry crane at an automated port, thereby saving time and improving overall operation efficiency.
The embodiment of the invention also provides unmanned vehicle alignment equipment, comprising a processor and a memory in which executable instructions of the processor are stored, wherein the processor is configured to perform the steps of the unmanned vehicle alignment method by executing the executable instructions.
As described above, the unmanned vehicle alignment equipment can accurately drive to the target coordinates in an automated manner and realizes high-precision alignment of the truck with the bridge crane and the gantry crane at an automated port, thereby saving time and improving overall operation efficiency.
As will be appreciated by one skilled in the art, aspects of the present invention may be embodied as a system, method or program product. Thus, various aspects of the invention may be embodied in the form of: an entirely hardware embodiment, an entirely software embodiment (including firmware, microcode, etc.), or an embodiment combining hardware and software aspects, which may all generally be referred to herein as a "circuit," "module" or "platform."
Fig. 9 is a schematic structural diagram of the unmanned vehicle aligning apparatus of the present invention. An electronic device 600 according to this embodiment of the invention is described below with reference to fig. 9. The electronic device 600 shown in fig. 9 is only an example, and should not bring any limitation to the functions and the scope of use of the embodiments of the present invention.
As shown in fig. 9, the electronic device 600 is embodied in the form of a general purpose computing device. The components of the electronic device 600 may include, but are not limited to: at least one processing unit 610, at least one memory unit 620, a bus 630 connecting the different platform components (including the memory unit 620 and the processing unit 610), a display unit 640, etc.
Wherein the storage unit stores program code executable by the processing unit 610 to cause the processing unit 610 to perform the steps according to various exemplary embodiments of the present invention described in the above unmanned vehicle alignment method section of this specification. For example, the processing unit 610 may perform the steps shown in fig. 1.
The storage unit 620 may include readable media in the form of volatile memory units, such as a random access memory unit (RAM)6201 and/or a cache memory unit 6202, and may further include a read-only memory unit (ROM) 6203.
The memory unit 620 may also include a program/utility 6204 having a set (at least one) of program modules 6205, such program modules 6205 including, but not limited to: an operating system, one or more application programs, other program modules, and program data, each of which, or some combination thereof, may comprise an implementation of a network environment.
Bus 630 may be one or more of several types of bus structures, including a memory unit bus or memory unit controller, a peripheral bus, an accelerated graphics port, a processing unit, or a local bus using any of a variety of bus architectures.
The electronic device 600 may also communicate with one or more external devices 700 (e.g., keyboard, pointing device, bluetooth device, etc.), with one or more devices that enable a user to interact with the electronic device 600, and/or with any devices (e.g., router, modem, etc.) that enable the electronic device 600 to communicate with one or more other computing devices. Such communication may occur via an input/output (I/O) interface 650. Also, the electronic device 600 may communicate with one or more networks (e.g., a Local Area Network (LAN), a Wide Area Network (WAN), and/or a public network such as the Internet) via the network adapter 660. The network adapter 660 may communicate with other modules of the electronic device 600 via the bus 630. It should be appreciated that although not shown in the figures, other hardware and/or software modules may be used in conjunction with the electronic device 600, including but not limited to: microcode, device drivers, redundant processing units, external disk drive arrays, RAID systems, tape drives, and data backup storage platforms, to name a few.
The embodiment of the invention also provides a computer readable storage medium for storing a program, and the program realizes the steps of the unmanned vehicle alignment method when executed. In some possible embodiments, the aspects of the present invention may also be implemented in the form of a program product comprising program code for causing a terminal device to perform the steps according to various exemplary embodiments of the present invention described in the above unmanned vehicle alignment method section of this specification, when the program product is run on the terminal device.
As described above, the program of the computer-readable storage medium of this embodiment can accurately travel to the target coordinates in an automated manner when executed, and realize high-precision alignment of the truck, the bridge crane, and the gantry crane in the automated port, thereby saving time and improving the overall operation efficiency.
Fig. 10 is a schematic structural diagram of a computer-readable storage medium of the present invention. Referring to fig. 10, a program product 800 for implementing the above method according to an embodiment of the present invention is described, which may employ a portable compact disc read only memory (CD-ROM) and include program code, and may be run on a terminal device, such as a personal computer. However, the program product of the present invention is not limited in this regard and, in the present document, a readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
The program product may employ any combination of one or more readable media. The readable medium may be a readable signal medium or a readable storage medium. A readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples (a non-exhaustive list) of the readable storage medium include: an electrical connection having one or more wires, a portable disk, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
A readable signal medium may include a propagated data signal with readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A readable signal medium may also be any readable medium other than a readable storage medium that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
Program code for carrying out operations for aspects of the present invention may be written in any combination of one or more programming languages, including an object oriented programming language such as Java or C++ and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computing device, partly on the user's device, as a stand-alone software package, partly on the user's computing device and partly on a remote computing device, or entirely on the remote computing device or server. In the case of a remote computing device, the remote computing device may be connected to the user computing device through any kind of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or may be connected to an external computing device (e.g., through the internet using an internet service provider).
In summary, the unmanned vehicle alignment method, system, equipment and storage medium of the invention can accurately drive to the target coordinate in an automatic mode, and realize the high-precision alignment of the truck, the bridge crane and the gantry crane at an automatic port, thereby saving time and improving the whole operation efficiency.
The foregoing is a more detailed description of the invention in connection with specific preferred embodiments and it is not intended that the invention be limited to these specific details. For those skilled in the art to which the invention pertains, several simple deductions or substitutions can be made without departing from the spirit of the invention, and all shall be considered as belonging to the protection scope of the invention.

Claims (11)

1. An unmanned vehicle alignment method is characterized by comprising the following steps:
S101, shooting, with a fixed camera of the unmanned vehicle, an image of a square two-dimensional code pattern with side length L cm on the ground ahead in the driving direction; establishing a coordinate system according to the image, wherein the origin of the coordinate system is one corner of the rectangular image, and the coordinates of the four corners of the square two-dimensional code pattern deformed into a trapezoid by perspective deformation are (X1, Y1), (X2, Y2), (X3, Y3) and (X4, Y4), respectively;
Setting a first matrix

A = | X1  Y1  1 |
    | X2  Y2  1 |
    | X3  Y3  1 |
    | X4  Y4  1 |

and a second matrix

B = | X1′  Y1′ |
    | X2′  Y2′ |
    | X3′  Y3′ |
    | X4′  Y4′ |

in which (Xi′, Yi′) (i = 1…4) are the coordinates of the corresponding corners in the top-view state; obtaining a transformation matrix M from the first matrix A and the second matrix B, M = (AᵀA)⁻¹AᵀB;
S102, arranging a plurality of square two-dimensional code patterns on a driving line, and establishing a mapping relation table of each two-dimensional code pattern and positioning coordinate information of the two-dimensional code pattern;
S103, shooting a current image of the driving direction by using a camera of the unmanned vehicle, collecting the two-dimensional code pattern in the current image, and stretching and deforming the two-dimensional code pattern in the image into a square two-dimensional code pattern;
S104, judging whether the two-dimensional code pattern in the current image hits an entry of the mapping relation table; if so, executing the step S105, and if not, returning to the step S103;
S105, establishing a coordinate system according to the current image, wherein the coordinates of the four corners of the square two-dimensional code pattern deformed into a trapezoid by perspective deformation in the current image are (X5, Y5), (X6, Y6), (X7, Y7) and (X8, Y8), respectively;

Setting a current matrix

C = | X5  Y5  1 |
    | X6  Y6  1 |
    | X7  Y7  1 |
    | X8  Y8  1 |

obtaining a look-down matrix

D = C·M = | X5′  Y5′ |
          | X6′  Y6′ |
          | X7′  Y7′ |
          | X8′  Y8′ |

the distance between the unmanned vehicle and the two-dimensional code pattern in the current image being S cm, with

S = (Y5′ + Y6′ + Y7′ + Y8′) / 4;
S106, obtaining the current coordinates of the unmanned vehicle according to the distance between the unmanned vehicle and the two-dimensional code pattern in the current image obtained in the step S105 and the positioning coordinate information of the square two-dimensional code pattern in the current image;

S107, judging whether the unmanned vehicle reaches the target coordinates; if so, executing the step S109, and if not, executing the step S108;
S108, comparing the current coordinates and the target coordinates of the unmanned vehicle and applying throttle or brake accordingly; and

S109, ending.
2. The unmanned vehicle alignment method of claim 1, wherein the step S108 comprises:
S1081, collecting the current speed Vt of the unmanned vehicle;
S1082, calculating a target acceleration a_n = −Vt²/(2S), the uniform deceleration that would bring the vehicle to rest at the target over the remaining distance S;
S1083, calculating the current actual acceleration of the vehicle by a difference method, where Vt is the current vehicle speed, Vt−1 is the vehicle speed at the previous moment, and Δt is the time interval between two sampled data:

a_r = (Vt − Vt−1) / Δt;
S1084, calculating the difference e(t) between the current target acceleration a_n and the actual acceleration a_r;
s1085, calculating the current required accelerator and brake values:
Figure FDA0002565942840000023
wherein the value ranges of kp, ki and kd are all (0, 1); when u(t) is greater than zero, u(t) is the throttle value, and when u(t) is less than zero, u(t) is the brake value.
3. The unmanned vehicle alignment method of claim 2, wherein in the step S1085, kp = 0.6, ki = 0.1 and kd = 0.3.
4. The unmanned vehicle alignment method of claim 1, wherein:
in the step S101, it is assumed that each pixel of the image corresponds to 1 cm, and the coordinates of the four corners of the square two-dimensional code pattern in the top-view state are (-0.5L, L), (0.5L, L), (0.5L, 2L) and (-0.5L, 2L), respectively;

then the second matrix corresponding to the top-view state is established as

B = | -0.5L   L  |
    |  0.5L   L  |
    |  0.5L  2L  |
    | -0.5L  2L  |.
5. The unmanned vehicle alignment method of claim 1, wherein: the origin of the rectangular coordinate system of the image in the step S101 is the lower left corner of the image, with the X axis extending rightward from the origin and the Y axis extending upward from the origin;
the origin of the coordinate system in the step S105 is the lower left corner of the current image, likewise with the X axis extending rightward from the origin and the Y axis extending upward from the origin.
6. The unmanned vehicle alignment method of claim 1, wherein: all the square two-dimensional code patterns in the step S101 have the same size.
7. The unmanned vehicle alignment method of claim 1, wherein: the order of the step S101 and the step S102 may be exchanged.
8. The unmanned vehicle alignment method of claim 1, wherein: the resolution of the camera is 1024 x 768.
9. An unmanned vehicle alignment system for implementing the unmanned vehicle alignment method of any one of claims 1 to 8, comprising:
the transformation matrix obtaining module is used for shooting, with a fixed camera of the unmanned vehicle, an image of a square two-dimensional code pattern with side length L cm on the ground ahead in the driving direction, and for establishing a coordinate system according to the image, wherein the origin of the coordinate system is one corner of the rectangular image, and the coordinates of the four corners of the square two-dimensional code pattern deformed into a trapezoid by perspective deformation are (X1, Y1), (X2, Y2), (X3, Y3) and (X4, Y4), respectively;
Setting a first matrix

A = | X1  Y1  1 |
    | X2  Y2  1 |
    | X3  Y3  1 |
    | X4  Y4  1 |

and a second matrix

B = | X1′  Y1′ |
    | X2′  Y2′ |
    | X3′  Y3′ |
    | X4′  Y4′ |

in which (Xi′, Yi′) (i = 1…4) are the coordinates of the corresponding corners in the top-view state; obtaining a transformation matrix M from the first matrix A and the second matrix B, M = (AᵀA)⁻¹AᵀB;
The mapping relation establishing module is used for setting a plurality of square two-dimensional code patterns on a driving line and establishing a mapping relation table of each two-dimensional code pattern and the positioning coordinate information of the two-dimensional code pattern;
the image stretching deformation module is used for shooting a current image in the driving direction by using a camera of an unmanned vehicle, acquiring a two-dimensional code pattern in the current image and stretching and deforming the two-dimensional code pattern in the image into a square two-dimensional code pattern;
the first judgment module is used for judging whether the two-dimensional code pattern in the current image hits an entry of the mapping relation table; if so, the distance obtaining module is executed, and if not, the process returns to the image stretching deformation module;
a distance obtaining module for establishing a coordinate system according to the current image, wherein the coordinates of the four corners of the square two-dimensional code pattern deformed into a trapezoid by perspective deformation in the current image are (X5, Y5), (X6, Y6), (X7, Y7) and (X8, Y8), respectively;
Setting a current matrix

C = | X5  Y5  1 |
    | X6  Y6  1 |
    | X7  Y7  1 |
    | X8  Y8  1 |

obtaining a look-down matrix

D = C·M = | X5′  Y5′ |
          | X6′  Y6′ |
          | X7′  Y7′ |
          | X8′  Y8′ |

the distance between the unmanned vehicle and the two-dimensional code pattern in the current image being S cm, with

S = (Y5′ + Y6′ + Y7′ + Y8′) / 4;
the coordinate positioning module is used for obtaining the current coordinate of the unmanned vehicle according to the distance between the unmanned vehicle and the two-dimensional code pattern in the current image obtained by the distance obtaining module and the positioning coordinate information of the square two-dimensional code pattern in the current image;
the second judgment module is used for judging whether the unmanned vehicle reaches the target coordinate or not, and if not, the driving output module is executed;
and the driving output module is used for comparing the current coordinates and the target coordinates of the unmanned vehicle and applying throttle or brake accordingly.
10. An unmanned vehicle alignment apparatus, comprising:
a processor;
a memory having stored therein executable instructions of the processor;
wherein the processor is configured to perform the steps of the unmanned vehicle alignment method of any of claims 1-8 via execution of the executable instructions.
11. A computer readable storage medium storing a program which when executed performs the steps of the unmanned vehicle alignment method of any of claims 1 to 8.
CN201810708615.8A 2018-07-02 2018-07-02 Unmanned vehicle alignment method, system, equipment and storage medium Active CN108875689B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810708615.8A CN108875689B (en) 2018-07-02 2018-07-02 Unmanned vehicle alignment method, system, equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201810708615.8A CN108875689B (en) 2018-07-02 2018-07-02 Unmanned vehicle alignment method, system, equipment and storage medium

Publications (2)

Publication Number Publication Date
CN108875689A CN108875689A (en) 2018-11-23
CN108875689B true CN108875689B (en) 2020-11-06

Family

ID=64298076

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810708615.8A Active CN108875689B (en) 2018-07-02 2018-07-02 Unmanned vehicle alignment method, system, equipment and storage medium

Country Status (1)

Country Link
CN (1) CN108875689B (en)

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110175662A (en) * 2019-05-05 2019-08-27 昆明理工大学 A kind of means of transportation two-dimension code recognition device and method
CN111275662B (en) * 2019-08-29 2024-02-06 上海飞机制造有限公司 Workpiece positioning method, device, equipment and storage medium based on two-dimension code
CN112446916A (en) * 2019-09-02 2021-03-05 北京京东乾石科技有限公司 Method and device for determining parking position of unmanned vehicle
CN112750166B (en) * 2020-12-29 2023-07-28 上海西井科技股份有限公司 Automatic calibration method, system, equipment and storage medium for camera of quay crane dome camera
CN113469045B (en) * 2021-06-30 2023-05-02 上海西井信息科技有限公司 Visual positioning method and system for unmanned integrated card, electronic equipment and storage medium
CN113743312B (en) * 2021-09-06 2022-05-17 广州唐斯科技有限公司 Image correction method and device based on vehicle-mounted terminal

Citations (3)

Publication number Priority date Publication date Assignee Title
CN104596525A (en) * 2014-12-29 2015-05-06 西南交通大学 Vehicle positioning method based on coded graphics
CN105388899A (en) * 2015-12-17 2016-03-09 中国科学院合肥物质科学研究院 An AGV navigation control method based on two-dimension code image tags
CN107632602A (en) * 2017-09-01 2018-01-26 上海斐讯数据通信技术有限公司 AGV trolley travelling tracks method for correcting error and system, terrestrial reference Quick Response Code acquisition device

Family Cites Families (2)

Publication number Priority date Publication date Assignee Title
TW201727415A (en) * 2016-01-27 2017-08-01 鴻海精密工業股份有限公司 Computer vision positioning system combining artificial marker and tow-dimensional code
US20180130008A1 (en) * 2016-11-06 2018-05-10 Yan Liu System and Method for Aerial Vehicle Automatic Landing and Cargo Delivery


Non-Patent Citations (1)

Title
Design of a Positioning and Navigation System for a Mobile Robot Based on Artificial Landmarks; Xu Decheng; China Master's Theses Full-text Database, Information Science and Technology; 20170315; I138-5130 *

Also Published As

Publication number Publication date
CN108875689A (en) 2018-11-23

Similar Documents

Publication Publication Date Title
CN108875689B (en) Unmanned vehicle alignment method, system, equipment and storage medium
CN109242903B (en) Three-dimensional data generation method, device, equipment and storage medium
US11210534B2 (en) Method for position detection, device, and storage medium
CN109829947B (en) Pose determination method, tray loading method, device, medium, and electronic apparatus
CN109345596B (en) Multi-sensor calibration method, device, computer equipment, medium and vehicle
CN112764053B (en) Fusion positioning method, device, equipment and computer readable storage medium
CN112233136B (en) Method, system, equipment and storage medium for alignment of container trucks based on binocular recognition
CN110263713B (en) Lane line detection method, lane line detection device, electronic device, and storage medium
CN108876857B (en) Method, system, device and storage medium for positioning unmanned vehicle
CN111366912B (en) Laser sensor and camera calibration method, system, device and storage medium
CN113213340B (en) Method, system, equipment and storage medium for unloading collection card based on lockhole identification
CN112198878B (en) Instant map construction method and device, robot and storage medium
CN113168189A (en) Flight operation method, unmanned aerial vehicle and storage medium
CN113763478A (en) Unmanned vehicle camera calibration method, device, equipment, storage medium and system
CN112784639A (en) Intersection detection, neural network training and intelligent driving method, device and equipment
CN109685851B (en) Hand-eye calibration method, system, equipment and storage medium of walking robot
CN114758163B (en) Forklift movement control method and device, electronic equipment and storage medium
CN110853098A (en) Robot positioning method, device, equipment and storage medium
CN114429631B (en) Three-dimensional object detection method, device, equipment and storage medium
CN113345023B (en) Box positioning method and device, medium and electronic equipment
CN113310484B (en) Mobile robot positioning method and system
CN112489240B (en) Commodity display inspection method, inspection robot and storage medium
CN113443387A (en) Port unmanned container truck alignment method, device, equipment and storage medium
CN112750166B (en) Automatic calibration method, system, equipment and storage medium for camera of quay crane dome camera
US20230375697A1 (en) System and Method for Support Structure Detection

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
CP01 Change in the name or title of a patent holder

Address after: Room 503-3, 398 Jiangsu Road, Changning District, Shanghai 200050

Patentee after: Shanghai Xijing Technology Co.,Ltd.

Address before: Room 503-3, 398 Jiangsu Road, Changning District, Shanghai 200050

Patentee before: SHANGHAI WESTWELL INFORMATION AND TECHNOLOGY Co.,Ltd.
