CN115401689A - Monocular camera-based distance measuring method and device and computer storage medium - Google Patents
- Publication number
- CN115401689A CN115401689A CN202210919231.7A CN202210919231A CN115401689A CN 115401689 A CN115401689 A CN 115401689A CN 202210919231 A CN202210919231 A CN 202210919231A CN 115401689 A CN115401689 A CN 115401689A
- Authority
- CN
- China
- Prior art keywords
- measured
- point
- distance
- robot
- image
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Links
- 238000000034 method Methods 0.000 title claims abstract description 54
- 238000003860 storage Methods 0.000 title claims abstract description 13
- 230000000007 visual effect Effects 0.000 claims abstract description 90
- 238000005259 measurement Methods 0.000 claims description 71
- 230000003287 optical effect Effects 0.000 claims description 44
- 238000000691 measurement method Methods 0.000 claims description 13
- 238000010408 sweeping Methods 0.000 description 46
- 238000010586 diagram Methods 0.000 description 10
- 238000003384 imaging method Methods 0.000 description 10
- 230000008569 process Effects 0.000 description 9
- 238000004364 calculation method Methods 0.000 description 8
- 230000008859 change Effects 0.000 description 3
- 238000012545 processing Methods 0.000 description 3
- 230000014509 gene expression Effects 0.000 description 2
- 238000012986 modification Methods 0.000 description 2
- 230000004048 modification Effects 0.000 description 2
- 230000009286 beneficial effect Effects 0.000 description 1
- 238000004140 cleaning Methods 0.000 description 1
- 238000009826 distribution Methods 0.000 description 1
- 230000000694 effects Effects 0.000 description 1
- 238000005516 engineering process Methods 0.000 description 1
- 230000006870 function Effects 0.000 description 1
- 238000009434 installation Methods 0.000 description 1
- 238000004519 manufacturing process Methods 0.000 description 1
- 238000013507 mapping Methods 0.000 description 1
- 230000009466 transformation Effects 0.000 description 1
Classifications
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1694—Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion
- B25J9/1697—Vision controlled systems
- B25J9/1602—Programme controls characterised by the control system, structure, architecture
- B25J9/1656—Programme controls characterised by programming, planning systems for manipulators
- B25J9/1664—Programme controls characterised by programming, planning systems for manipulators characterised by motion, path, trajectory planning
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/70—Determining position or orientation of objects or cameras
Landscapes
- Engineering & Computer Science (AREA)
- Robotics (AREA)
- Mechanical Engineering (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Automation & Control Theory (AREA)
- Length Measuring Devices By Optical Means (AREA)
Abstract
The application discloses a distance measuring method, a device and a computer storage medium based on a monocular camera, wherein the distance measuring method comprises the following steps: acquiring a first image to be measured from a first visual field area of the monocular camera at a first position, and determining image coordinates of a point to be measured in the first image to be measured; acquiring relative position information between the first view area and the robot based on actual posture information of the mechanical arm; and determining a first actual distance between the point to be measured and the robot based on the relative position information, the size information of the first image to be measured and the image coordinates of the point to be measured. The distance measuring device can measure the absolute distance between any target point and the robot in the visual field range by utilizing the image acquired by the monocular camera and the relative position relation between the robot and the monocular camera, and the application capability and the application range of the monocular camera are expanded.
Description
Technical Field
The present disclosure relates to the field of distance measurement technologies, and in particular, to a distance measurement method and apparatus based on a monocular camera, and a computer storage medium.
Background
At present, in fields and products such as autonomous automobile driving and sweeping robots, accurate absolute distance measurement of a target object is a precondition for the equipment to work normally.
Devices currently used for distance measurement mainly include the monocular camera, the binocular camera, lidar, the structured light camera, and ultrasonic devices. Compared with the other devices, the monocular camera has lower hardware cost and meets product cost requirements; however, conventional distance measurement algorithms for the monocular camera can only distinguish the relative distance relationship between different acquisition points and cannot obtain the absolute distance between an acquisition point and the monocular camera, so the monocular camera suffers from scale uncertainty when measuring absolute distance.
Disclosure of Invention
The application provides a distance measuring method and device based on a monocular camera and a computer storage medium.
The technical scheme adopted by the application is that a distance measuring method based on a monocular camera is provided, the distance measuring method is applied to a robot, and the tail end of a mechanical arm of the robot is fixedly connected with the monocular camera;
the distance measuring method comprises the following steps:
acquiring a first image to be measured from a first visual field area of the monocular camera at a first position, and determining image coordinates of a point to be measured in the first image to be measured;
acquiring relative position information between the first view area and the robot based on actual posture information of the mechanical arm;
and determining a first actual distance between the point to be measured and the robot based on the relative position information, the size information of the first image to be measured and the image coordinates of the point to be measured.
In this way, the distance measuring device can measure the absolute distance between any target point in the visual field range and the robot using only the image acquired by the monocular camera and the relative position relationship between the robot and the monocular camera, expanding the application capability and application range of the monocular camera without adding extra components, structural complexity, or hardware cost.
Wherein the step of determining a first actual distance of the point to be measured from the robot comprises:
determining a first measurement distance of a projection optical axis of the point to be measured projected to the first visual field area based on the size information of the first visual field area, the size information of the first image to be measured and the image coordinates of the point to be measured;
determining a second measurement distance from a projection point of the point to be measured on the projection optical axis to the robot based on the relative position information of the first visual field area, the size information of the first image to be measured, the size information of the first visual field area, and the image coordinates of the point to be measured;
determining a first actual distance between the point to be measured and the robot according to the first measurement distance and the second measurement distance;
the projection optical axis is a projection line of a main optical axis of the monocular camera on a plane where the first visual field area is located.
In this way, the distance measuring device determines the distance from the point to be measured to the projection optical axis of the monocular camera's visual field area and the distance from the projection point of the point to be measured on that axis to the robot; the actual distance from the point to be measured to the robot can then be obtained simply by the Pythagorean theorem, a simple and fast calculation.
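The two-distance combination above can be sketched in a few lines of Python (a hedged illustration; function and variable names are ours, not the patent's):

```python
import math

def first_actual_distance(first_measurement: float, second_measurement: float) -> float:
    """Combine the lateral offset of the point from the projection optical
    axis (first measurement distance) with the distance from its projection
    point on that axis to the robot (second measurement distance).
    The two distances are perpendicular in the working plane, so the
    Pythagorean theorem applies directly."""
    return math.hypot(first_measurement, second_measurement)
```

Since the two distances are perpendicular, `math.hypot` returns the hypotenuse, i.e. the actual point-to-robot distance.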
Wherein the determining a second measurement distance from a projection point of the point to be measured on the projection optical axis to the robot based on the relative position information, the size information of the first image to be measured, the size information of the first visual field area, and the image coordinates of the point to be measured includes:
determining a closest measured distance of the robot to the first field of view region based on the relative position information;
determining a third measurement distance from the projection point to an upper bottom edge of the first visual field area based on the size information of the first visual field area, the image coordinates of the point to be measured and the size information of the first image to be measured, wherein the upper bottom edge is an edge between the projection point and the robot;
and determining a second measurement distance from the projection point to the robot according to the nearest measurement distance and the third measurement distance.
In this way, the distance measuring device determines the distance from the robot to the upper bottom edge of the visual field area and the distance from the projection point of the point to be measured to that edge, so the measurement distance from the projection point on the projection optical axis to the robot can be calculated simply.
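This step reduces to a sum along the projection optical axis; a minimal sketch under the assumption that both input distances are measured along that same axis (names are illustrative):

```python
def second_measurement_distance(nearest_distance: float, third_distance: float) -> float:
    """Distance from the projection point on the projection optical axis to
    the robot: the robot's closest distance to the visual field area plus
    the projection point's distance from the upper bottom edge, both taken
    along the projection optical axis."""
    return nearest_distance + third_distance
```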
Wherein the determining a first measurement distance of the projection optical axis of the point to be measured projected to the first field of view region based on the size information of the first field of view region, the size information of the first image to be measured, and the image coordinates of the point to be measured includes:
determining a fourth measurement distance from the projection point to a side edge on the same side as the point to be measured in the first view field area based on the dimension information of the first view field area, the dimension information of the first image to be measured and the image coordinates of the point to be measured;
determining a proportional relation between the first measuring distance and the fourth measuring distance based on the image coordinates of the point to be measured and the size information of the first image to be measured;
and determining a first measurement distance from the point to be measured to a projection optical axis of the first visual field area according to the proportional relation and the fourth measurement distance.
In this way, the distance measuring device obtains the measurement distance from the projection point to the side edge of the visual field area, which enables a simple similar-triangle transformation, and from that side-edge distance calculates the measurement distance from the point to be measured to the projection optical axis.
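Under the assumption that each image row maps to a ground line parallel to AB, so that horizontal pixel ratios on a row are preserved on the corresponding ground line, the similar-triangle step can be sketched as follows (the column index `i` and width `w` follow the w × h image of the patent; the function name is ours):

```python
def first_measurement_distance(fourth_distance: float, i: float, w: float) -> float:
    """Lateral distance from the point to be measured to the projection
    optical axis. fourth_distance is the ground distance from the
    projection point to the side edge on the same side as the point;
    the proportional relation is the pixel offset of column i from the
    image centre line, |i - w/2|, over the half-width w/2."""
    return fourth_distance * abs(2.0 * i - w) / w
```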
Wherein the distance measuring method further comprises:
acquiring a second image to be measured from a second visual field area of the monocular camera at a second position, wherein the point to be measured is in the second image to be measured;
acquiring movement information of the robot moving from the first position to the second position;
and determining a second actual distance between the point to be measured and the robot at the second position according to the movement information and the first actual distance.
In this way, after the distance measuring device determines the actual distance between the robot at the initial position and the point to be measured, if the robot moves to another position while the point to be measured remains in the monocular camera's visual field area, the device can calculate the actual distance at the moved position from the actual distance at the initial position, effectively simplifying the calculation process and its complexity.
Wherein the determining a second actual distance between the point to be measured and the robot at the second position according to the movement information and the first actual distance comprises:
acquiring a first direction from the point to be measured to the robot at the first position and a second direction from the point to be measured to the robot at the second position;
determining relative angle information of the first location and the second location based on the first direction and the second direction;
and determining a second actual distance between the point to be measured and the robot at the second position according to the movement information, the first actual distance and the relative angle information.
Through the mode, the distance measuring device can directly and quickly determine the actual distance between the robot at the moving position and the point to be measured by specifically utilizing the actual distance between the robot at the initial position and the point to be measured and the relative angle relationship between the initial position and the moving position.
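With the relative angle known, the second actual distance follows from the law of cosines; a hedged sketch, where `alpha` denotes the angle at the first position between the direction to the point and the direction of motion (our interpretation of the patent's "relative angle information"):

```python
import math

def second_actual_distance(d1: float, move_distance: float, alpha: float) -> float:
    """Distance from the point to be measured to the robot at the second
    position, given the first actual distance d1, the distance moved, and
    the angle alpha (radians) between the point direction and the motion
    direction at the first position (law of cosines)."""
    return math.sqrt(d1 * d1 + move_distance * move_distance
                     - 2.0 * d1 * move_distance * math.cos(alpha))
```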
Wherein said determining relative angular information of said first location and said second location based on said first direction and said second direction comprises:
acquiring a movement angle of the robot moving from the first position to the second position;
determining an included angle between a connecting line of the point to be measured and the robot and the projection optical axis based on the first measuring distance and the second measuring distance;
and determining relative angle information of the first position and the second position based on the movement angle and the included angle.
In this way, the distance measuring device measures the movement angle between the initial position and the moved position using an odometer and/or other measuring instruments mounted on the robot, and thereby calculates the relative angle information of the two positions without adding extra components, effectively reducing hardware cost.
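The included angle itself follows from the two measurement distances already computed; a one-line sketch (illustrative names):

```python
import math

def included_angle(first_distance: float, second_distance: float) -> float:
    """Angle between the line joining the point to be measured to the
    robot and the projection optical axis: the lateral offset (first
    measurement distance) over the along-axis distance (second measurement
    distance), via atan2."""
    return math.atan2(first_distance, second_distance)
```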
The distance measuring method further comprises the following steps:
acquiring a third image to be measured from a third visual field area of the monocular camera at a third position, and determining image coordinates of other points to be measured in the third image to be measured, wherein the robot remains stationary during the process from the first position to the third position while the position of the monocular camera relative to the robot changes;
acquiring updated relative position information between the third visual field area and the robot based on the updated actual posture information of the mechanical arm;
determining a third actual distance between the other points to be measured and the robot based on the updated relative position information, the size information of the third image to be measured, and the image coordinates of the other points to be measured.
In this way, even when the robot remains stationary and only the position of the monocular camera changes, the distance measuring device can measure the absolute distance between any target point in the visual field range and the robot using only the image acquired by the monocular camera and the relative position relationship between them, effectively expanding the application range of the distance measuring method.
Another technical solution adopted by the present application provides a monocular camera-based distance measuring device, which includes an acquisition module, a position module, and a measurement module.
the acquisition module is used for acquiring a first image to be measured from a first visual field area of the monocular camera at a first position and determining image coordinates of points to be measured in the first image to be measured;
the position module is used for acquiring relative position information between the first visual field area and the robot based on actual posture information of the mechanical arm;
the measuring module is used for determining a first actual distance between the point to be measured and the robot based on the relative position information, the size information of the first image to be measured and the image coordinates of the point to be measured.
The measurement module is further configured to determine a first measurement distance of the projection optical axis of the point to be measured projected to the first view field area based on the size information of the first view field area, the size information of the first image to be measured, and the image coordinates of the point to be measured; determining a second measurement distance from a projection point of the point to be measured on the projection optical axis to the robot based on the relative position information of the first visual field area, the size information of the first image to be measured, the size information of the first visual field area, and the image coordinates of the point to be measured; determining a first actual distance between the point to be measured and the robot according to the first measuring distance and the second measuring distance; the projection optical axis is a projection line of a main optical axis of the monocular camera on a plane where the first visual field area is located.
Wherein the measurement module is further configured to determine a closest measurement distance of the robot to the first field of view based on the relative position information of the first field of view; determining a third measurement distance from the projection point to an upper bottom edge of the first visual field area based on the size information of the first visual field area, the image coordinates of the point to be measured and the size information of the first image to be measured, wherein the upper bottom edge is an edge between the projection point and the robot; and determining a second measurement distance from the projection point to the robot according to the nearest measurement distance and the third measurement distance.
The measuring module is further configured to determine a fourth measuring distance from the projection point to the side edge of the first view field area on the same side as the point to be measured, based on the size information of the first view field area, the size information of the first image to be measured, and the image coordinates of the point to be measured;
determining a proportional relation between the first measuring distance and the fourth measuring distance based on the image coordinates of the point to be measured and the size information of the first image to be measured;
and determining a first measurement distance from the point to be measured to a projection optical axis of the first visual field area according to the proportional relation and the fourth measurement distance.
The measuring module is further used for acquiring a second image to be measured from a second visual field area of the monocular camera at a second position, wherein the point to be measured is in the second image to be measured; acquiring movement information of the robot moving from the first position to the second position; and determining a second actual distance between the point to be measured and the robot at the second position according to the movement information and the first actual distance.
The measuring module is further used for acquiring a first direction from the point to be measured to the robot at the first position and a second direction from the point to be measured to the robot at the second position; determining relative angle information of the first location and the second location based on the first direction and the second direction; and determining a second actual distance between the point to be measured and the robot at the second position according to the movement information, the first actual distance and the relative angle information.
The measuring module is further used for acquiring a movement angle of the robot moving from the first position to the second position; determining an included angle between a connecting line of the point to be measured and the robot and the projection optical axis based on the first measuring distance and the second measuring distance; and determining relative angle information of the first position and the second position based on the movement angle and the included angle.
Another technical solution adopted by the present application is to provide another distance measuring apparatus of a monocular camera, which includes a memory and a processor coupled to the memory;
wherein the memory is configured to store program data and the processor is configured to execute the program data to implement the distance measuring method as described above.
Another technical solution adopted by the present application is to provide a computer storage medium for storing program data, which when executed by a computer, is used to implement the distance measuring method as described above.
The beneficial effect of this application is: the distance measuring device acquires a first image to be measured from a first visual field area of the monocular camera at a first position and determines image coordinates of a point to be measured in the first image to be measured; acquiring relative position information between the first view area and the robot based on actual posture information of the mechanical arm; and determining a first actual distance between the point to be measured and the robot based on the relative position information of the first visual field area, the size information of the first image to be measured and the image coordinates of the point to be measured. The distance measuring device can measure the absolute distance between any target point and the robot in the visual field range by utilizing the image acquired by the monocular camera and the relative position relation between the robot and the monocular camera, and the application capability and the application range of the monocular camera are expanded.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application, the drawings needed in the description of the embodiments are briefly introduced below. The drawings described below are obviously only some embodiments of the present application; other drawings can be obtained from them by those skilled in the art without creative effort.
Fig. 1 is a schematic flowchart of an embodiment of a monocular camera-based distance measuring method provided in the present application;
FIG. 2 is an isometric view of a first scene of a distance measurement method provided herein;
FIG. 3 is a schematic diagram of an imaged picture of a first scene of a distance measuring method provided by the present application;
FIG. 4 is a top view of a first scenario of a distance measurement method provided herein;
FIG. 5 is a front view of a first scenario of a distance measurement method provided herein;
FIG. 6 is a schematic diagram of points to be measured in an imaging screen provided by the present application;
FIG. 7 is a diagram showing the position relationship between the point to be measured and other known points provided in the present application;
FIG. 8 is a schematic flowchart of another embodiment of a monocular camera based distance measuring method provided by the present application;
FIG. 9 is an isometric view of a second scenario of a distance measurement method provided herein;
FIG. 10 is an isometric view of a third scenario of a distance measurement method provided herein;
FIG. 11 is a front view of a third scenario of a distance measuring method provided herein;
FIG. 12 is a schematic structural diagram of an embodiment of a monocular camera-based distance measuring device according to the present application;
fig. 13 is a schematic structural diagram of another embodiment of a distance measuring device of a monocular camera according to the present application;
FIG. 14 is a schematic structural diagram of an embodiment of a computer storage medium provided in the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be described clearly and completely with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only some embodiments of the present application, and not all embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
Referring to fig. 1, fig. 1 is a schematic flowchart illustrating an embodiment of a monocular camera-based distance measuring method according to the present disclosure.
The distance measuring method in the embodiment of the present application can be applied to robots, for example mobile robots such as sweeping robots, logistics robots, mowing robots, cleaning robots, and carrying robots; it can also be applied to a processing system or processor mounted in the mobile robot, or to a control system outside the mobile robot. Hereinafter, a mobile robot is taken as an example.
As shown in fig. 1, the distance measuring method in the embodiment of the present application may specifically include the following steps:
step S11: a first image to be measured is acquired from a first field of view region of a monocular camera at a first position, and image coordinates of points to be measured in the first image to be measured are determined.
In the embodiment of the present application, a specific scene of the distance measurement method is described by taking as an example a sweeping robot, a two-joint mechanical arm installed on it, and a monocular camera at the end of the arm. Specifically, as shown in figs. 2 to 5: fig. 2 is an axonometric view of the first scene of the distance measurement method provided by the present application, fig. 3 is a schematic view of the corresponding imaging picture, fig. 4 is a top view of the first scene, and fig. 5 is a front view of the first scene.
Specifically, the sweeping robot is in an initial pose state, and at the moment, the sweeping robot is located at a first position. As shown in fig. 2, the sweeping robot body on the left side in the drawing is placed on a flat working surface, such as a horizontal ground, a point in the middle of the front of the sweeping robot body is S, and the point S is used for representing the position of the sweeping robot in the following description. In the embodiment of the present application, the point S may represent a first position.
Two mechanical arm joints are installed on the sweeping robot body; in other embodiments, one or another number of joints may be installed, which are not listed here. A monocular camera is mounted at the end point O of the mechanical arm; its main optical axis OP points straight ahead and is inclined downward. Point Q is an object point in the visual field area of the monocular camera, i.e., the point to be measured Q in fig. 2.
According to the imaging principle of the monocular camera, when the included angle between the main optical axis OP and the working surface is smaller than 90°, the visual field area (e.g., the first visual field area) of the monocular camera is the trapezoid area ABCD, and the corresponding frame acquired by the monocular camera is the rectangle A'B'C'D' shown in fig. 3, i.e., the first image to be measured, with size w × h, where w is the width and h is the height. In the first image to be measured A'B'C'D', the image coordinates of the point to be measured Q may be represented as (i, j).
It should be noted that, for convenience of observation, the visual field area ABCD is drawn with thickness in fig. 2; in fact it lies in the working plane and has no thickness, and the subsequent discussion assumes it has none. In figs. 4 and 5, point M is the midpoint of segment AB and point T is the midpoint of segment CD; M' and T' are the image points of M and T in the imaged picture A'B'C'D', so M' is the midpoint of segment A'B' and T' is the midpoint of segment C'D'.
In fig. 4, γ and β are the distal and proximal opening angles of the monocular camera in the image width direction, respectively. These two parameters are generally fixed after manufacture and can therefore be treated as known values. Since point M is the midpoint of segment AB, point T is the midpoint of segment CD, and the opening angle of the monocular camera in the image width direction is symmetric, OM and OT are the angular bisectors of ∠AOB and ∠COD respectively.
In fig. 5, ρ is the opening angle of the monocular camera in the image height direction; this parameter is also generally fixed after the camera is manufactured and shipped, and thus may be treated as a known value. Since OP is the main optical axis and the opening angle of the camera in the image height direction has symmetry, OP is the angular bisector of ∠TOM.
Step S12: and acquiring relative position information between the first view field and the mobile robot based on the actual posture information of the mechanical arm.
In the embodiment of the present application, the actual posture information of the mechanical arm includes the length of the mechanical arm and the angular relationship between the mechanical arm and the working surface, such as an included angle. The distance measuring device can calibrate the relative position information between the sweeping robot and the visual field area according to the actual posture information of the mechanical arm, including the relative distance from the sweeping robot to any point in the visual field area, such as the distances from the sweeping robot to the points T and M of the visual field area ABCD.
Optionally, for convenience of description, for the sweeping robot to be located in the initial pose shown in fig. 2, 4, and 5, in the embodiment of the present application, calibration is performed on relevant data of the sweeping robot:
The height of the sweeping robot body and the lengths of the two mechanical arm joints J1J2 and J2O are fixed and known, and are denoted l, l1 and l2 respectively; the unit can be centimeters. The point J is the projection of the point J1 on the working surface.
The included angles between the two mechanical arm joints and the working surface are known and are denoted θ1 and θ2 respectively.
OS0 is perpendicular to the working surface at S0; the lengths of OS0 and SS0 are known and are denoted l0 and l3 respectively, in centimeters.
When the position, the angle and the height above the working surface of the monocular camera are fixed and known, the distance from the mobile robot body to the visual field area of the monocular camera and the width of the visual field are also fixed and known. Therefore, the lengths of S0T, S0P and TM are known, denoted l4, lp and l5 respectively, in centimeters; AB and CD are also known and are denoted l6 and l7 respectively, in centimeters.
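Although these constants are treated as calibrated known values, they can also be derived from the camera height and opening angles by the same right-triangle trigonometry used later for the third visual field area. The following sketch, with hypothetical example values and the notation of figs. 4 and 5, illustrates one way the constants could be computed:

```python
import math

def field_constants(l0, lp, rho, gamma, beta):
    """Derive the view-area constants from the camera height l0 = OS0,
    the axis intersection distance lp = S0P, and the camera opening
    angles rho (height direction), gamma (distal) and beta (proximal);
    angles in radians, lengths in centimeters."""
    angle_s0op = math.atan(lp / l0)        # angle between OS0 and the main optical axis
    angle_s0ot = angle_s0op - rho / 2      # toward the near edge CD
    angle_s0om = angle_s0op + rho / 2      # toward the far edge AB
    l4 = l0 * math.tan(angle_s0ot)         # S0T
    l5 = l0 * math.tan(angle_s0om) - l4    # TM
    ot = l0 / math.cos(angle_s0ot)         # slant range OT
    om = l0 / math.cos(angle_s0om)         # slant range OM
    l7 = 2 * ot * math.tan(beta / 2)       # CD, near width
    l6 = 2 * om * math.tan(gamma / 2)      # AB, far width
    return l4, l5, l6, l7
```

The example assumes the camera is tilted enough that the near edge lies in front of the projection point S0 (angle_s0ot > 0).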
Step S13: and determining a first actual distance between the point to be measured and the mobile robot based on the relative position information, the size information of the first image to be measured and the image coordinates of the point to be measured.
In the embodiment of the present application, after the calibration of the data in step S11 and step S12, the distance measuring apparatus may further calculate the actual distance between the point to be measured and the mobile robot based on the data calibrated in the above steps.
Specifically, with reference to fig. 6 and fig. 7, fig. 6 is a schematic diagram of a point to be measured in an imaging screen, and fig. 7 is a positional relationship between the point to be measured and other known points.
Assuming that the sweeping robot is in the state shown in fig. 7, after the calibration process of step S12, l and lk (k = 0, 1, …, 7) are known values. Fig. 6 shows an arbitrary pixel point Q'(i, j) in the imaged picture, where i is the abscissa and j is the ordinate. In fig. 7, the pixel point Q' is the image point of the point Q to be measured, and the distance measurement method of the present application aims to obtain the actual distance between the point Q to be measured and the sweeping robot S.
In fig. 7, QN is parallel to MB, intersects TM at the point N and intersects CB at the point L. Clearly, QN is perpendicular to TM with foot N. Then, BC is extended beyond C to intersect the extension of MT beyond T at the point K, which lies on line S0T.
In conjunction with fig. 6 and 7, the following line segment proportional relationship can be derived:
TN/NM = j/(h − j) (1)
TN + NM = l5 (2)
The equations (1) and (2) can be solved to obtain:
TN = j·l5/h, NM = (h − j)·l5/h (3)
in fig. 7, similar triangles Δ KTC to Δ KNL and Δ KMB to Δ KNL exist, and therefore, the following relationship exists:
where the expressions to the right of equation (4) and equation (5) equal sign are equal, therefore, the following equation can be derived:
wherein, in equation (6), TC = CD/2 = l7/2 and MB = AB/2 = l6/2; since l7 and l6 are known, TC and MB are also known.
In addition, l5 in the following equation is also known:
KM − KT = TM = l5 (7)
Equations (6) and (7) involve only the two unknowns KT and KM, from which it is solved:
KM = l5·l6/(l6 − l7) (8)
KT = l5·l7/(l6 − l7) (9)
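The solution can be checked numerically. The snippet below, with arbitrary example lengths, verifies that the expressions (8) and (9) satisfy both the difference relation (7) and the similar-triangle proportion (6):

```python
# Example lengths (cm): TM = l5, AB = l6, CD = l7, chosen arbitrarily.
l5, l6, l7 = 40.0, 60.0, 30.0
KM = l5 * l6 / (l6 - l7)                 # equation (8)
KT = l5 * l7 / (l6 - l7)                 # equation (9)
assert abs((KM - KT) - l5) < 1e-9        # equation (7): KM - KT = TM
TC, MB = l7 / 2, l6 / 2
assert abs(TC / KT - MB / KM) < 1e-12    # equation (6): TC/KT = MB/KM
```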
Substituting KN = KT + TN into the above equation (4) gives:
NL = TC·(KT + TN)/KT (10)
Substituting TC = CD/2 = l7/2 and the results of equation (2), equation (3) and equation (9) into equation (10) gives:
NL = (l7/2)·[1 + j·(l6 − l7)/(h·l7)] (11)
Simplifying equation (11) yields:
NL = [j·l6 + (h − j)·l7]/(2h) (12)
Since the image point Q' of the point Q has the pixel coordinates (i, j) in the imaged picture, the pixel distance between it and the image point N' of N is |i − w/2|, and NL has a length of w/2 pixels in the imaged picture; then:
NQ/NL = |i − w/2|/(w/2) (13)
it is noted that the establishment of the above equation requires a premise that the distribution of the points on NL (including its extension on the other side of TM) on the corresponding portion N 'L' (including its extension on the other side of T 'M') of the imaging screen is uniform. In general, this premise is true for an ideal camera, and is also approximately true for a real camera, with little error.
Based on this, the above equation (12) and equation (13) are combined to obtain:
NQ = |2i − w|·[j·l6 + (h − j)·l7]/(2wh) (14)
For simplicity of writing, the embodiment of the present application denotes the right-hand side of the above equation (14) by the symbol lNQ. Because Rt ΔSNQ is a right triangle, the Pythagorean theorem shows that:
SQ² = (SS0 + S0T + TN)² + NQ² (15)
Substituting SS0 = l3, S0T = l4 and the above equation (3) and equation (14) into the above equation gives:
SQ = √[(l3 + l4 + j·l5/h)² + lNQ²] (16)
therefore, the distance measuring device can directly calculate the actual distance SQ from the point to be measured to the sweeping robot by substituting i, j, w and h into the expression (16).
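As a sketch (not part of the patent text itself), equation (16) can be written directly as a short routine; i, j are the pixel coordinates, w, h the image size, and l3 to l7 the calibrated constants above, with j assumed to grow from the image edge that corresponds to the near edge C'D':

```python
import math

def actual_distance(i, j, w, h, l3, l4, l5, l6, l7):
    """Equation (16): distance SQ from the robot S to the point whose
    image coordinates are (i, j) in a w-by-h picture; l3..l7 are the
    calibrated constants, lengths in centimeters."""
    tn = j * l5 / h                                              # equation (3)
    nq = abs(2 * i - w) * (j * l6 + (h - j) * l7) / (2 * w * h)  # equation (14)
    sn = l3 + l4 + tn                                            # SS0 + S0T + TN
    return math.hypot(sn, nq)                                    # Pythagorean theorem
```

For a point on the projected optical axis (i = w/2) the lateral term vanishes and the distance reduces to l3 + l4 + j·l5/h.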
Wherein w and h are the image size, which is a calibrated known item once the camera position and the visual field area are determined, and i, j are the image coordinates of a point on the image. The correspondence between a point to be measured in the visual field area and a point in the image can be obtained in either of the following ways: in the first way, the correspondence between the visual field area and the image is known, the position of the point to be measured in the visual field area is calibrated, and the position of the measurement point in the visual field area is then mapped to the image according to that correspondence, thereby determining the image coordinates of the measurement point; in the second way, the measurement point is marked in the visual field area, and the marked measurement point is identified in the image, thereby determining the image coordinates of the measurement point on the image.
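The first correspondence mode — mapping a calibrated floor position into the image — amounts to inverting equations (3) and (14). The sketch below assumes, for illustration only, that the lateral offset NQ is signed (positive toward the B side of the axis):

```python
def floor_to_image(tn, nq, w, h, l5, l6, l7):
    """Map a calibrated floor position, given as the along-axis offset
    TN from the near-edge midpoint T and a signed lateral offset NQ
    from the projected optical axis, to image coordinates (i, j) by
    inverting equations (3) and (14); the sign convention for nq is
    an assumption for illustration."""
    j = tn * h / l5                            # invert equation (3)
    nl = (j * l6 + (h - j) * l7) / (2 * h)     # half field width at this row
    i = w / 2 + nq * (w / 2) / nl              # invert the pixel proportion NQ/NL
    return i, j
```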
In a specific embodiment, the distance measuring device may determine a first measurement distance NQ from the point to be measured to the projection optical axis of the first visual field area based on the size information of the first visual field area ABCD, the size information of the first image to be measured A'B'C'D', and the image coordinates (i, j) of the point to be measured; and determine a second measurement distance SN from the projection point of the point to be measured on the projection optical axis to the mobile robot based on the relative position information between the first visual field area ABCD and the robot, the size information of the first image to be measured A'B'C'D', the size information of the first visual field area ABCD, and the image coordinates (i, j) of the point to be measured.
The specific calculation process of the first measurement distance NQ is as follows:
determining a fourth measurement distance NL from the projection point to a side edge of the first visual field area on the same side as the point to be measured, based on the size information of the first visual field area ABCD, the size information of the first image to be measured A'B'C'D', and the image coordinates of the point to be measured; determining a proportional relation |i − w/2| : w/2 between the first measurement distance and the fourth measurement distance based on the image coordinates of the point to be measured and the size information of the first image to be measured; and determining the first measurement distance from the point to be measured to the projection optical axis of the first visual field area according to the proportional relation and the fourth measurement distance NL. The formula for calculating the first measurement distance NQ is as follows:
NQ = |2i − w|·[j·l6 + (h − j)·l7]/(2wh)
the specific calculation process of the second measurement distance SN is as follows:
determining a closest measured distance ST of the mobile robot to the first field of view region based on the relative position information of the first field of view region; determining a third measurement distance TN from the projection point to an upper bottom edge of the first view area based on the size information of the first view area, the image coordinates of the point to be measured and the size information of the first image to be measured, wherein the upper bottom edge is an edge between the projection point and the mobile robot; and determining a second measurement distance SN from the projection point to the mobile robot according to the nearest measurement distance and the third measurement distance. The formula for calculating the second measurement distance SN is as follows:
SN = l3 + l4 + j·l5/h
Then, the distance measuring device can calculate the actual distance from the point to be measured to the sweeping robot from the first measurement distance NQ and the second measurement distance SN by the Pythagorean theorem, namely as shown in equation (16).
In the embodiment of the application, the distance measuring device acquires a first image to be measured from a first visual field area of the monocular camera at a first position and determines image coordinates of a point to be measured in the first image to be measured; acquiring relative position information between the first visual field area and the mobile robot based on actual posture information of the mechanical arm; and determining a first actual distance between the point to be measured and the mobile robot based on the relative position information of the first visual field area, the size information of the first image to be measured and the image coordinates of the point to be measured. The distance measuring device provided by the embodiment of the application can measure the absolute distance between any target point and the mobile robot in the visual field range by utilizing the image acquired by the monocular camera and the relative position relationship between the mobile robot and the monocular camera, and the application capability and the application range of the monocular camera are expanded.
In the above embodiment, the distance measuring device calculates the actual distance between the point to be measured and the sweeping robot when the sweeping robot is in the initial pose state, and further, when the sweeping robot is changed from the initial pose state to another moving state, that is, when the sweeping robot moves from the first position to the second position, if the point to be measured is still within the visual field range of the monocular camera, the actual distance between the point to be measured and the second position can also be determined. It should be noted that, the movement from the first position to the second position refers to the movement of the sweeping robot, but the relative poses of the mechanical arm and the monocular camera do not change with respect to the sweeping robot.
Referring to fig. 8, fig. 8 is a schematic flowchart illustrating another embodiment of the monocular camera based distance measuring method according to the present disclosure.
As shown in fig. 8, the distance measuring method according to the embodiment of the present application may specifically include the following steps:
step S21: a second image to be measured is acquired from a second field of view area of the monocular camera at a second position, wherein the point to be measured is in the second image to be measured.
In the embodiment of the present application, the point to be measured here is the same point in real space as the point to be measured in the embodiment shown in fig. 1, that is, after the sweeping robot moves from the first position to the second position, the point to be measured still falls within the visual field area of the monocular camera.
The second position is the actual spatial position of the sweeping robot after the sweeping robot moves from the first position, and the second view field area is the area where the image acquisition range of the monocular camera is projected to the plane of the sweeping robot after the sweeping robot reaches the second position.
Step S22: movement information of the robot moving from a first position to a second position is acquired.
The movement information comprises the moving distance of the mobile robot from the first position to the second position, and the change in the shooting angle of the monocular camera as the mobile robot moves from the first position to the second position.
In the embodiment of the present application, as in the above embodiments, l and lk (k = 0, 1, …, 7) are known, and the width w and the height h of the imaged picture shown in fig. 3 are known. The sweeping robot now moves from the first position (denoted S) in fig. 2 to another, second position (denoted S2). As shown in fig. 9, the pose of S2 relative to S, i.e. the length of the line segment SS2 and the turning angle α, can be calculated by an odometer and/or other hardware installed on the sweeping robot. In both positions, the point Q to be measured is in the field of view of the monocular camera; the image point of the point Q in the imaged picture of the monocular camera at the position S is Q', with pixel coordinates (i, j). The distance measuring method aims to obtain the distance between the sweeping robot at the position S2 and the point Q, i.e. the length of the line segment S2Q.
Step S23: and determining a second actual distance between the point to be measured and the robot at the second position according to the movement information and the first actual distance.
In the embodiment of the present application, SS0 = l3, S0T = l4 and equation (3) show that:
SN = SS0 + S0T + TN = l3 + l4 + j·l5/h (17)
For simplicity of writing, the right-hand side of equation (17) is denoted by the symbol lSN. As can be seen from the embodiment shown in fig. 1, ΔSNQ is a right triangle, and from equations (14), (16) and (17) the three sides NQ, SQ and SN of the triangle can all be calculated. Therefore, the angle
∠NSQ = arctan(lNQ/lSN) (18)
can also be calculated, wherein lNQ is defined in equation (14) above, so that the angle difference ∠QSS2 = α − ∠NSQ can also be obtained.
From the above, in ΔQSS2, the adjacent sides SQ and SS2 and the included angle ∠QSS2 have all been calculated; then, according to the cosine theorem, the length of S2Q is calculated as follows:
S2Q = √(SQ² + SS2² − 2·SQ·SS2·cos∠QSS2) (19)
Therefore, the distance measuring device only needs to substitute SQ, SS2 and ∠QSS2 into equation (19) to directly calculate the actual distance S2Q between the point to be measured and the sweeping robot.
In a specific embodiment, the distance measuring device acquires a first direction from the point to be measured to the mobile robot at the first position S and a second direction from the point to be measured to the mobile robot at the second position S2; based on the first direction and the second direction, the relative angle information ∠QSS2 (or cos∠QSS2) of the first position and the second position can be determined; and according to the moving distance SS2, the first actual distance SQ and the relative angle information ∠QSS2 (or cos∠QSS2), the second actual distance S2Q between the point to be measured and the mobile robot at the second position S2 is determined.
Specifically, the calculation process of the relative angle information ∠QSS2 (or cos∠QSS2) is as follows: acquiring the movement angle α of the mobile robot moving from the first position to the second position; determining the included angle ∠NSQ between the line connecting the point to be measured and the mobile robot and the projection optical axis based on the first measurement distance NQ and the second measurement distance SN; and determining the relative angle information ∠QSS2 (or cos∠QSS2) of the first position and the second position based on the movement angle α and the included angle ∠NSQ.
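The second-position computation of equations (17)-(19) can be sketched as follows; sq, l_nq and l_sn come from the first-position formulas, while ss2 and alpha come from the odometer:

```python
import math

def distance_after_move(sq, l_nq, l_sn, ss2, alpha):
    """Distance S2Q after the robot moves by ss2 = SS2 and turns by
    alpha (radians); sq, l_nq and l_sn are the first-position
    quantities SQ, NQ and SN."""
    angle_nsq = math.atan2(l_nq, l_sn)     # angle between SQ and the projected axis
    angle_qss2 = alpha - angle_nsq         # relative angle at S
    # law of cosines in triangle QSS2
    return math.sqrt(sq**2 + ss2**2 - 2 * sq * ss2 * math.cos(angle_qss2))
```

With alpha = 0 and the point on the axis, the robot drives straight toward Q and the result reduces to SQ − SS2, a quick sanity check.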
In the embodiment of the application, the monocular camera can simultaneously measure the absolute distance of any target object in the visual field range in multiple directions in the process of finishing image acquisition, so that the application capability of the monocular camera is expanded; compared with a binocular camera and a structured light camera, the monocular camera is simple in structure and low in cost, so that the embodiment of the application can obtain similar effects of the binocular camera and the structured light camera by using a low-cost scheme; compared with two devices, namely a radar device and an ultrasonic device, which can not acquire the visual information of the environment when measuring the distance, the monocular camera can simultaneously realize distance measurement and acquire the visual information, and the cost of the whole scene device can be reduced.
Further, the distance measuring method can also be applied to sweeping robots in other motion states. For example, when the position of the sweeping robot remains unchanged, the relative position of the monocular camera and the sweeping robot changes, i.e., the monocular camera moves from the first position to the third position.
In an embodiment of the present application, the distance measuring method further includes: acquiring a third image to be measured from a third visual field area of a monocular camera at a third position, and determining image coordinates of other points to be measured in the third image to be measured, wherein the robot keeps still in the process of moving from the first position to the third position, and the position of the monocular camera relative to the robot changes; acquiring updated relative position information between the third visual field area and the robot based on the updated actual posture information of the mechanical arm; determining a third actual distance between the other points to be measured and the robot based on the updated relative position information, the size information of the third image to be measured, and the image coordinates of the other points to be measured.
Specifically, referring to fig. 10, for convenience of illustration, the angle of the robot arm and the field of view of the camera in the two states are plotted in the same graph.
In fig. 10, in the first position, i.e. state S, the monocular camera position is O and its projection on the working surface is S0; at this time, the view field of the monocular camera is the trapezoid ABCD, and the midpoints of the upper base and the lower base of the trapezoid are T and M, respectively. The position of the mobile robot is not changed, and the monocular camera is then adjusted downward until it reaches the position O2 in fig. 10, so that the monocular camera moves from the first position O to the third position O2.
In the third position, i.e. state S2, the monocular camera position is O2, the mechanical arm joint 1 and the mechanical arm joint 2 are J12J22 and J22O2 respectively, and the projection of O2 on the working surface is S02; at this time, the view field of the monocular camera is the trapezoid A2B2C2D2, i.e. the third visual field area, and the midpoints of its upper base and lower base are T2 and M2. Since fig. 10 contains too many lines, the main optical axes of the camera in the two states are not drawn; for this reason, the front view of fig. 10 is drawn as shown in fig. 11.
In fig. 11, OP and O2P2 are the main optical axes of the monocular camera in states S and S2 respectively. With respect to fig. 10 and 11, the following points hold:
the state S can be calculated by the hardware structure of the sweeping robot and the data of related components 2 The next two mechanical arm joints J 12 J 22 And J 22 O 2 The angles between the working surfaces are respectively recorded as theta 12 And theta 22 I.e. updated actual pose information of the robot arm.
Because the installation mode of the monocular camera is fixed relative to the mechanical arm joint 2, the included angle between the mechanical arm joint 2 and the main optical axis of the monocular camera is the same in the two states, namely ∠J2OP = ∠J22O2P2.
Since the structure of the monocular camera itself is fixed, that is, its view angles are fixed, the view angles in the two states have the following relationships: ∠TOM = ∠T2O2M2, ∠AOB = ∠A2O2B2, ∠COD = ∠C2O2D2.
Because the field angle of the monocular camera has symmetry, O2M2 and O2T2 are the angular bisectors of ∠A2O2B2 and ∠C2O2D2 respectively.
From the above equation (16), it can be seen that obtaining the distance SQ1 at that time only requires knowing the values of l32, l42, l52, l62 and l72, wherein the updated relative position information between the third visual field area and the robot comprises l32, l42 and l52. These values are found in a few small steps.
Finding l32:
As can be seen from fig. 5 and 11:
l1·cosθ1 + l2·cosθ2 = JS + SS0 (20)
l1·cosθ12 + l2·cosθ22 = JS + SS02 (21)
Substituting SS0 = l3 into equation (20), JS can be calculated; substituting JS into equation (21) then gives:
l32 = SS02 = l3 + l1·(cosθ12 − cosθ1) + l2·(cosθ22 − cosθ2) (22)
find l 42
As is known from fig. 5 and 11:
∠J2OP = ∠J2OS0 + ∠S0OP = (90° − θ2) + ∠S0OP (23)
∠J22O2P2 = ∠J22O2S02 + ∠S02O2P2 = (90° − θ22) + ∠S02O2P2 (24)
Because ∠J2OP = ∠J22O2P2, therefore:
(90° − θ2) + ∠S0OP = (90° − θ22) + ∠S02O2P2 (25)
that is:
∠S02O2P2 = (θ22 − θ2) + ∠S0OP (26)
Since lp and l0 are known and ΔOS0P is a right triangle, ∠S0OP = arctan(lp/l0); substituting this into equation (26) gives:
∠S02O2P2 = (θ22 − θ2) + arctan(lp/l0) (27)
Because ∠T2O2M2 is the flare angle of the monocular camera in the image height direction, ∠T2O2M2 = ρ. And because O2P2 is the main optical axis and the field angle of the camera has symmetry, O2P2 is the angular bisector of ∠T2O2M2, so:
∠T2O2P2 = ∠T2O2M2/2 = ρ/2 (28)
Considering the right triangle Rt ΔO2S02P2, there is:
∠S02O2T2 = ∠S02O2P2 − ∠T2O2P2 (29)
Combining equations (27) to (29), it is possible to obtain:
∠S02O2T2 = (θ22 − θ2) + arctan(lp/l0) − ρ/2 (30)
For convenience of writing, the right-hand side of the above formula is denoted by the symbol φ. By an analysis of the arm geometry similar to that used to find l32 above, the new camera height is:
l02 = O2S02 = l0 + l1·(sinθ12 − sinθ1) + l2·(sinθ22 − sinθ2) (31)
Because ΔO2S02T2 is a right triangle, the combination of equation (30) and equation (31) shows that:
l42 = S02T2 = l02·tanφ (32)
find l 52
In the right triangle ΔO2S02M2, there is:
∠S02O2M2 = ∠S02O2T2 + ∠T2O2M2 (33)
that is:
∠S02O2M2 = φ + ρ (34)
In the right triangle ΔO2S02M2, applying equation (31) and equation (34), it is possible to obtain:
S02M2 = l02·tan(φ + ρ) (35)
Combining equation (32) and equation (35), there is:
l52 = T2M2 = S02M2 − S02T2 = l02·[tan(φ + ρ) − tanφ] (36)
find l 62 And l 72
In the right triangle ΔO2S02M2, the Pythagorean theorem gives:
O2M2² = O2S02² + S02M2² (37)
By substituting equation (31) and equation (35) into equation (37), the following can be obtained:
O2M2 = l02·√(1 + tan²(φ + ρ)) = l02/cos(φ + ρ) (38)
because of the fact thatIs the fixed view angle of the monocular camera and O 2 M 2 Is < A 2 O 2 B 2 And the field of view of the monocular camera is symmetrical, so Δ A 2 O 2 B 2 Is an isosceles triangle, thus Δ O 2 M 2 A 2 Is a right triangle, so there are:
By substituting equation (38) into equation (39):
l62 = 2·l02·tan(γ/2)/cos(φ + ρ) (40)
In the right triangle ΔO2S02T2, the same reasoning steps are adopted to obtain:
l72 = C2D2 = 2·l02·tan(β/2)/cosφ (41)
By means of equation (16), the distance from the other point to be measured of the third image to be measured, i.e. the image point Q'1(i, j) corresponding to the point Q1 of real space, to the point S, i.e. the third actual distance, is:
SQ1 = √[(l32 + l42 + j·l52/h)² + lNQ1²] (42)
wherein lNQ1 = |2i − w|·[j·l62 + (h − j)·l72]/(2wh), by analogy with equation (14).
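The whole third-position computation above — updating l32, l42, l52, l62 and l72 and then applying the equation-(16) form — can be sketched in one routine. The camera-height update l02 is inferred here from the arm geometry and is an assumption, as the corresponding step is not fully legible in this text:

```python
import math

def third_position_distance(i, j, w, h, l0, l3, lp, rho, gamma, beta,
                            l1, l2, th1, th2, th12, th22):
    """Third-position distance sketch: the arm angles change from
    (th1, th2) to (th12, th22) while the robot body stays put.
    Angles in radians, lengths in centimeters."""
    # updated offset SS02 from the horizontal arm projections
    l32 = l3 + l1 * (math.cos(th12) - math.cos(th1)) \
              + l2 * (math.cos(th22) - math.cos(th2))
    # assumed analogue for the camera height O2S02
    l02 = l0 + l1 * (math.sin(th12) - math.sin(th1)) \
              + l2 * (math.sin(th22) - math.sin(th2))
    # angle between O2S02 and O2T2
    phi = (th22 - th2) + math.atan(lp / l0) - rho / 2
    l42 = l02 * math.tan(phi)                           # S02T2
    l52 = l02 * (math.tan(phi + rho) - math.tan(phi))   # T2M2
    l62 = 2 * l02 * math.tan(gamma / 2) / math.cos(phi + rho)  # A2B2
    l72 = 2 * l02 * math.tan(beta / 2) / math.cos(phi)         # C2D2
    # equation-(16) form with the updated constants
    sn = l32 + l42 + j * l52 / h
    nq = abs(2 * i - w) * (j * l62 + (h - j) * l72) / (2 * w * h)
    return math.hypot(sn, nq)
```

With unchanged arm angles (th12 = th1, th22 = th2) the routine reproduces the first-position constants, which serves as a consistency check.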
as can be derived from the distance measurement at the third position, the monocular camera of the sweeping robot moves to the third position, which is substantially the same as the distance measurement process when the robot is at the first position, that is, the moving of the sweeping robot from the first position to the third position is only performed when the relative position between the monocular camera and the sweeping robot changes, but the actual distance from the point to be measured to the sweeping robot does not change.
In conclusion, the distance measuring method of the present application supports simple and fast calculation for the same point to be measured when the mobile robot moves in different directions and in different motion states, and for different points to be measured within the field of view of the monocular camera.
Further, the distance measuring method can effectively solve the problems of robot obstacle avoidance, robot tracing and the like by accurately identifying the actual distance between the robot and the point to be measured in the physical space. For example, for a sweeping robot, an obstacle needs to be avoided for an obstacle in a room, at this time, the sweeping robot can recognize the obstacle from a shot image by shooting an environment in the room, and an actual distance between the current sweeping robot and the obstacle is obtained by using the distance measuring method of the present application, and finally, an accurate obstacle avoiding function is realized according to the actual distance.
The above embodiments are only examples of the present disclosure, and do not limit the technical scope of the present disclosure, so that any minor modifications, equivalent changes or modifications made from the above embodiments according to the spirit of the present disclosure will still fall within the technical scope of the present disclosure.
Referring to fig. 12, fig. 12 is a schematic structural diagram of an embodiment of a monocular camera-based distance measuring device according to the present disclosure. The distance measuring device 30 includes an acquisition module 31, a position module 32, and a measurement module 33.
The acquisition module 31 is configured to acquire a first image to be measured from a first field of view area of the monocular camera at a first position, and determine image coordinates of a point to be measured in the first image to be measured.
The position module 32 is configured to acquire relative position information between the first view area and the robot based on actual posture information of the robot arm.
The measuring module 33 is configured to determine a first actual distance between the point to be measured and the robot based on the relative position information of the first field of view, the size information of the first image to be measured, and the image coordinates of the point to be measured.
The measuring module 33 is further configured to determine a first measurement distance from the point to be measured to the projection optical axis of the first visual field area based on the size information of the first visual field area, the size information of the first image to be measured, and the image coordinates of the point to be measured; determine a second measurement distance from the projection point of the point to be measured on the projection optical axis to the robot based on the relative position information of the first visual field area, the size information of the first image to be measured, the size information of the first visual field area, and the image coordinates of the point to be measured; and determine the first actual distance between the point to be measured and the robot according to the first measurement distance and the second measurement distance; wherein the projection optical axis is the projection line of the main optical axis of the monocular camera on the plane where the first visual field area is located.
Wherein the measuring module 33 is further configured to determine a closest measuring distance from the robot to the first field of view based on the relative position information of the first field of view; determining a third measurement distance from the projection point to an upper bottom edge of the first visual field area based on the size information of the first visual field area, the image coordinates of the point to be measured and the size information of the first image to be measured, wherein the upper bottom edge is an edge between the projection point and the robot; and determining a second measurement distance from the projection point to the robot according to the nearest measurement distance and the third measurement distance.
The measuring module 33 is further configured to determine a fourth measuring distance from the projection point to the side edge of the first view field area on the same side as the point to be measured, based on the size information of the first view field area, the size information of the first image to be measured, and the image coordinates of the point to be measured;
determining a proportional relation between the first measuring distance and the fourth measuring distance based on the image coordinates of the point to be measured and the size information of the first image to be measured;
and determining a first measurement distance from the point to be measured to a projection optical axis of the first visual field area according to the proportional relation and the fourth measurement distance.
The measuring module 33 is further configured to acquire a second image to be measured from a second visual field area of the monocular camera at a second position, where the point to be measured is in the second image to be measured; acquiring movement information of the mobile robot moving from the first position to the second position; and determining a second actual distance between the point to be measured and the robot at the second position according to the movement information and the first actual distance.
The measuring module 33 is further configured to acquire a first direction from the point to be measured to the robot at the first position, and a second direction from the point to be measured to the mobile robot at the second position; determining relative angle information of the first location and the second location based on the first direction and the second direction; and determining a second actual distance between the point to be measured and the robot at the second position according to the movement information, the first actual distance and the relative angle information.
The measuring module 33 is further configured to obtain a moving angle of the robot moving from the first position to the second position; determining an included angle between a connecting line of the point to be measured and the robot and the projection optical axis based on the first measuring distance and the second measuring distance; and determining relative angle information of the first position and the second position based on the movement angle and the included angle.
Referring to fig. 13, fig. 13 is a schematic structural diagram of another embodiment of a distance measuring device of a monocular camera according to the present application. The distance measuring apparatus 500 of the embodiment of the present application includes a processor 51, a memory 52, an input/output device 53, and a bus 54.
The processor 51, the memory 52, and the input/output device 53 are respectively connected to the bus 54, the memory 52 stores program data, and the processor 51 is configured to execute the program data to implement the distance measuring method according to the above embodiment.
In the embodiments of the present application, the processor 51 may also be referred to as a CPU (Central Processing Unit). The processor 51 may be an integrated circuit chip having signal processing capabilities. The processor 51 may also be a general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, or discrete hardware components. The general-purpose processor may be a microprocessor, or the processor 51 may be any conventional processor or the like.
Referring to fig. 14, fig. 14 is a schematic structural diagram of an embodiment of the computer storage medium provided in the present application. The computer storage medium 600 stores program data 61, and the program data 61, when executed by a processor, implements the distance measuring method of the above embodiments.
The embodiments of the present application may be implemented as software functional units and, when sold or used as stand-alone products, may be stored in a computer-readable storage medium. Based on such understanding, the technical solution of the present application, in essence, or the part thereof contributing to the prior art, or all or part of the technical solution, may be embodied in the form of a software product. The software product is stored in a storage medium and includes instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) or a processor to execute all or part of the steps of the methods described in the embodiments of the present application. The aforementioned storage medium includes any medium capable of storing program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disk.
The above description sets forth only embodiments of the present application and is not intended to limit the scope of the present application. Any equivalent structure or equivalent process transformation made using the contents of the specification and the accompanying drawings, whether applied directly or indirectly in other related technical fields, falls likewise within the scope of patent protection of the present application.
Claims (11)
1. A distance measuring method based on a monocular camera, characterized in that the distance measuring method is applied to a robot, and the monocular camera is fixedly connected to a distal end of a mechanical arm of the robot;
the distance measuring method comprises the following steps:
acquiring a first image to be measured from a first visual field area of the monocular camera at a first position, and determining image coordinates of points to be measured in the first image to be measured;
acquiring relative position information between the first view area and the robot based on actual posture information of the mechanical arm;
and determining a first actual distance between the point to be measured and the robot based on the relative position information, the size information of the first image to be measured and the image coordinates of the point to be measured.
2. The distance measuring method according to claim 1, wherein the step of determining a first actual distance of the point to be measured from the robot comprises:
determining a first measurement distance of a projection optical axis of the point to be measured projected to the first visual field area based on the size information of the first visual field area, the size information of the first image to be measured and the image coordinates of the point to be measured;
determining a second measurement distance from a projection point of the point to be measured on the projection optical axis to the robot based on the relative position information of the first visual field area, the size information of the first image to be measured, the size information of the first visual field area, and the image coordinates of the point to be measured;
determining a first actual distance between the point to be measured and the robot according to the first measurement distance and the second measurement distance;
the projection optical axis is a projection line of a main optical axis of the monocular camera on a plane where the first visual field area is located.
3. The distance measuring method according to claim 2,
the determining a second measurement distance from the projection point of the point to be measured on the projection optical axis to the robot based on the relative position information, the size information of the first image to be measured, the size information of the first visual field area, and the image coordinates of the point to be measured, includes:
determining a nearest measurement distance from the robot to the first visual field area based on the relative position information;
determining a third measurement distance from the projection point to an upper bottom edge of the first visual field area based on the size information of the first visual field area, the image coordinates of the point to be measured and the size information of the first image to be measured, wherein the upper bottom edge is an edge between the projection point and the robot;
and determining a second measurement distance from the projection point to the robot according to the nearest measurement distance and the third measurement distance.
4. The distance measuring method according to claim 2,
the determining a first measurement distance of the projection optical axis of the point to be measured projected to the first visual field region based on the size information of the first visual field region, the size information of the first image to be measured, and the image coordinates of the point to be measured includes:
determining a fourth measurement distance from the projection point to a side edge of the first visual field area on the same side as the point to be measured, based on the size information of the first visual field area, the size information of the first image to be measured and the image coordinates of the point to be measured;
determining a proportional relation between the first measurement distance and the fourth measurement distance based on the image coordinates of the point to be measured and the size information of the first image to be measured;
and determining the first measurement distance from the point to be measured to the projection optical axis of the first visual field area according to the proportional relation and the fourth measurement distance.
5. The distance measuring method according to claim 2,
the distance measuring method further includes:
acquiring a second image to be measured from a second visual field area of the monocular camera at a second position, wherein the point to be measured is in the second image to be measured;
acquiring movement information of the robot moving from the first position to the second position;
and determining a second actual distance between the point to be measured and the robot at the second position according to the movement information and the first actual distance.
6. The distance measuring method according to claim 5,
the determining a second actual distance between the point to be measured and the robot at the second position according to the movement information and the first actual distance comprises:
acquiring a first direction from the point to be measured to the robot at the first position and a second direction from the point to be measured to the robot at the second position;
determining relative angle information of the first position and the second position based on the first direction and the second direction;
and determining a second actual distance between the point to be measured and the robot at the second position according to the movement information, the first actual distance and the relative angle information.
7. The distance measuring method according to claim 6,
the determining relative angle information of the first position and the second position based on the first direction and the second direction comprises:
acquiring a movement angle of the robot moving from the first position to the second position;
determining an included angle between a line connecting the point to be measured and the robot and the projection optical axis based on the first measurement distance and the second measurement distance;
and determining relative angle information of the first position and the second position based on the movement angle and the included angle.
8. The distance measuring method according to claim 1,
the distance measuring method further comprises the following steps:
acquiring a third image to be measured from a third visual field area of the monocular camera at a third position, and determining image coordinates of other points to be measured in the third image to be measured, wherein the robot remains stationary during the movement from the first position to the third position while the position of the monocular camera relative to the robot changes;
acquiring updated relative position information between the third visual field area and the robot based on the updated actual posture information of the mechanical arm;
determining a third actual distance between the other points to be measured and the robot based on the updated relative position information, the size information of the third image to be measured, and the image coordinates of the other points to be measured.
9. A monocular camera-based distance measuring device, characterized by comprising: an acquisition module, a position module and a measurement module; wherein:
the acquisition module is used for acquiring a first image to be measured from a first visual field area of the monocular camera at a first position and determining image coordinates of a point to be measured in the first image to be measured;
the position module is used for acquiring relative position information between the first view area and the robot based on actual posture information of the mechanical arm;
the measuring module is used for determining a first actual distance between the point to be measured and the robot based on the relative position information, the size information of the first image to be measured and the image coordinates of the point to be measured.
10. A monocular camera-based distance measuring device comprising a memory and a processor coupled to the memory;
wherein the memory is adapted to store program data and the processor is adapted to execute the program data to implement the distance measuring method according to any of claims 1-8.
11. A computer storage medium for storing program data, wherein the program data, when executed by a computer, implements the distance measuring method according to any one of claims 1 to 8.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202210919231.7A CN115401689B (en) | 2022-08-01 | 2022-08-01 | Distance measuring method and device based on monocular camera and computer storage medium |
Publications (2)
Publication Number | Publication Date |
---|---|
CN115401689A true CN115401689A (en) | 2022-11-29 |
CN115401689B CN115401689B (en) | 2024-03-29 |
Family
ID=84158789
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202210919231.7A Active CN115401689B (en) | 2022-08-01 | 2022-08-01 | Distance measuring method and device based on monocular camera and computer storage medium |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN115401689B (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN117953043A (en) * | 2024-03-26 | 2024-04-30 | 北京云力境安科技有限公司 | Area measurement method and device based on endoscopic image and storage medium |
Citations (21)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DE4314597A1 (en) * | 1993-05-04 | 1994-11-10 | Guido Dipl Ing Quick | Measuring arrangement for position determination in manipulators |
KR20080029080A (en) * | 2006-09-28 | 2008-04-03 | 부천산업진흥재단 | System for estimating self-position of the mobile robot using monocular zoom-camara and method therefor |
EP2622576A1 (en) * | 2010-10-01 | 2013-08-07 | Saab AB | Method and apparatus for solving position and orientation from correlated point features in images |
CN105518702A (en) * | 2014-11-12 | 2016-04-20 | 深圳市大疆创新科技有限公司 | Method, device and robot for detecting target object |
US20160107310A1 (en) * | 2014-10-17 | 2016-04-21 | Honda Motor Co., Ltd. | Controller for mobile robot |
CN109189060A (en) * | 2018-07-25 | 2019-01-11 | 博众精工科技股份有限公司 | The point-stabilized control method and device of mobile robot |
CN109483516A (en) * | 2018-10-16 | 2019-03-19 | 浙江大学 | A kind of mechanical arm hand and eye calibrating method based on space length and epipolar-line constraint |
CN110178157A (en) * | 2016-10-07 | 2019-08-27 | 富士胶片株式会社 | Self-position estimation device, self-position estimation method, program and image processing apparatus |
CN110370286A (en) * | 2019-08-13 | 2019-10-25 | 西北工业大学 | Dead axle motion rigid body spatial position recognition methods based on industrial robot and monocular camera |
CN110749290A (en) * | 2019-10-30 | 2020-02-04 | 易思维(杭州)科技有限公司 | Three-dimensional projection-based characteristic information rapid positioning method |
WO2020063708A1 (en) * | 2018-09-28 | 2020-04-02 | 杭州海康威视数字技术股份有限公司 | Method, device and system for calibrating intrinsic parameters of fisheye camera, calibration device controller and calibration tool |
CN111982072A (en) * | 2020-07-29 | 2020-11-24 | 西北工业大学 | Target ranging method based on monocular vision |
EP3834997A1 (en) * | 2019-12-11 | 2021-06-16 | Carl Zeiss Industrielle Messtechnik GmbH | Method and device for calibrating a machine vision device for position determination |
CN113246145A (en) * | 2021-07-02 | 2021-08-13 | 杭州景业智能科技股份有限公司 | Pose compensation method and system for nuclear industry grabbing equipment and electronic device |
CN113496528A (en) * | 2021-09-07 | 2021-10-12 | 湖南众天云科技有限公司 | Method and device for calibrating position of visual detection target in fixed traffic roadside scene |
CN114055444A (en) * | 2021-08-27 | 2022-02-18 | 清华大学 | Robot, control method and control device thereof, calibration method and calibration control device thereof, and storage medium |
WO2022041737A1 (en) * | 2020-08-28 | 2022-03-03 | 北京石头世纪科技股份有限公司 | Distance measuring method and apparatus, robot, and storage medium |
CN114523472A (en) * | 2022-01-24 | 2022-05-24 | 湖南视比特机器人有限公司 | Workpiece cooperative grabbing method and system and storage medium |
CN114536292A (en) * | 2022-02-16 | 2022-05-27 | 中国医学科学院北京协和医院 | Error detection method based on composite identification and robot system |
CN114677410A (en) * | 2022-03-28 | 2022-06-28 | 杭州萤石软件有限公司 | Obstacle ranging method, mobile robot, equipment and medium |
CN114770461A (en) * | 2022-04-14 | 2022-07-22 | 深圳技术大学 | Monocular vision-based mobile robot and automatic grabbing method thereof |
Non-Patent Citations (1)
Title |
---|
GAO SONG et al.: "Research on ranging and pose estimation method of a three-point laser based on a monocular camera", Acta Optica Sinica, vol. 41, no. 9, pages 0915001 - 1 *
Also Published As
Publication number | Publication date |
---|---|
CN115401689B (en) | 2024-03-29 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US10825198B2 (en) | 3 dimensional coordinates calculating apparatus, 3 dimensional coordinates calculating method, 3 dimensional distance measuring apparatus and 3 dimensional distance measuring method using images | |
WO2021139590A1 (en) | Indoor localization and navigation apparatus based on bluetooth and slam, and method therefor | |
US20210041236A1 (en) | Method and system for calibration of structural parameters and construction of affine coordinate system of vision measurement system | |
US7526121B2 (en) | Three-dimensional visual sensor | |
CN112880642B (en) | Ranging system and ranging method | |
US20120268567A1 (en) | Three-dimensional measurement apparatus, processing method, and non-transitory computer-readable storage medium | |
JP6324025B2 (en) | Information processing apparatus and information processing method | |
KR20200085670A (en) | Method for calculating a tow hitch position | |
CN112013858B (en) | Positioning method, positioning device, self-moving equipment and storage medium | |
CN112013850B (en) | Positioning method, positioning device, self-moving equipment and storage medium | |
Xia et al. | Global calibration of non-overlapping cameras: State of the art | |
WO2023185250A1 (en) | Obstacle distance measurement method, mobile robot, device and medium | |
US11222433B2 (en) | 3 dimensional coordinates calculating apparatus and 3 dimensional coordinates calculating method using photo images | |
US20210374978A1 (en) | Capturing environmental scans using anchor objects for registration | |
US20190080471A1 (en) | Distance measurement system and distance measurement method | |
CN115401689A (en) | Monocular camera-based distance measuring method and device and computer storage medium | |
US20190285404A1 (en) | Noncontact three-dimensional measurement system | |
CN113744340A (en) | Calibrating cameras with non-central camera models of axial viewpoint offset and computing point projections | |
JP3842988B2 (en) | Image processing apparatus for measuring three-dimensional information of an object by binocular stereoscopic vision, and a method for recording the same, or a recording medium recording the measurement program | |
CN114838702A (en) | Distance measuring method, electronic device, and storage medium | |
JP4227037B2 (en) | Imaging system and calibration method | |
JP2015135333A (en) | Information processing device, control method for information processing device, and program | |
JP2021038939A (en) | Calibration device | |
WO2020215296A1 (en) | Line inspection control method for movable platform, and line inspection control device, movable platform and system | |
Liu et al. | Snapshot: A self-calibration protocol for camera sensor networks |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||