CN208937054U - Positioning navigation system and robot based on two-dimensional code - Google Patents


Info

Publication number
CN208937054U
Authority
CN
China
Prior art keywords
dimensional code
positioning
sub
image
code pattern
Prior art date
Legal status
Active
Application number
CN201820085791.6U
Other languages
Chinese (zh)
Inventor
吕江
胡东
Current Assignee
Gangwan Intelligent Technology Suzhou Co ltd
Original Assignee
Water Rock Intelligent Technology Ningbo Co ltd
Priority date
Filing date
Publication date
Application filed by Water Rock Intelligent Technology Ningbo Co ltd
Priority to CN201820085791.6U
Application granted
Publication of CN208937054U
Legal status: Active

Landscapes

  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
  • Manipulator (AREA)

Abstract

The embodiments of the disclosure relate to a two-dimensional code positioning label, a positioning navigation system based on a two-dimensional code, and a robot. The two-dimensional code positioning label comprises a plurality of sub two-dimensional code patterns storing position identification information. The positioning navigation system based on the two-dimensional code comprises a plurality of positioning mark points, a plurality of positioning labels, an image acquisition device, an image processing device, and a control device. By identifying a positioning mark point through the two-dimensional code pattern array and collecting the positioning label image at that mark point, the embodiments of the disclosure can decode a sub two-dimensional code in the image to obtain the relevant position information and complete navigated movement based on that position information. The technical scheme provided by the embodiments of the disclosure can accurately acquire the positioning label image and can efficiently and accurately identify the position, thereby realizing accurate positioning navigation.

Description

Positioning navigation system and robot based on two-dimensional code
Technical Field
The utility model belongs to the technical field of positioning, and in particular relates to a two-dimensional code positioning label, a positioning navigation system based on a two-dimensional code, and a robot.
Background
Currently, the existing positioning and navigation technologies for mobile robots include Simultaneous Localization and Mapping (SLAM), magnetic navigation, GPS, image recognition navigation, inertial navigation, optical navigation, electromagnetic navigation, direct coordinate navigation, RFID positioning navigation, laser navigation, and the like. Considering factors such as technical maturity, achievable accuracy, specific scenarios, and cost, image recognition navigation, and in particular positioning navigation based on two-dimensional code image recognition, is widely applied in industry.
In the existing two-dimensional code positioning and navigation technology, a mobile robot first needs to collect a two-dimensional code image. Because the robot usually collects the code while moving, mechanical errors, calculation errors, road surface irregularities, and similar factors can prevent it from being aligned with the two-dimensional code during collection. In addition, because of uneven ground, stained labels, and similar causes, the collected image often suffers from problems such as blurring or an incomplete two-dimensional code. The mobile robot then cannot correctly process and decode the two-dimensional code image to acquire the information needed for positioning and navigation, navigation fails, and the operating efficiency of the system is seriously affected.
SUMMARY OF THE UTILITY MODEL
The purpose of the disclosure is to overcome the defects of the prior art, and provide a two-dimensional code positioning tag, a two-dimensional code-based positioning navigation system and a robot capable of realizing accurate positioning.
According to an aspect of this disclosure, a two-dimensional code positioning tag is provided, the two-dimensional code positioning tag is disposed at a positioning mark point, the two-dimensional code positioning tag includes:
a plurality of sub two-dimensional code patterns storing position identification information, wherein the sub two-dimensional code patterns form an n × m two-dimensional code pattern array, and n and m are positive integers;
the geometric center of the two-dimensional code pattern array coincides with the center of the positioning mark point;
the sub two-dimensional code pattern comprises a sub two-dimensional code and an auxiliary graphic externally surrounding the sub two-dimensional code.
Optionally, the outer contours of the sub two-dimensional code patterns are of equal or unequal size, and the row spacing and the column spacing between every two sub two-dimensional code patterns are equal or unequal.
Optionally, the auxiliary graphic is a closed graphic or a pattern.
Optionally, the auxiliary pattern is a square frame.
Optionally, the two-dimensional code positioning tag further includes identification information and a reference pattern, wherein:
the identification information is arranged in the two-dimensional code positioning tag and is used for recording position additional information;
the reference datum graph is arranged on the periphery of the two-dimensional code positioning label and used for providing a position datum when the two-dimensional code positioning label is arranged at a positioning mark point.
According to an aspect of the present disclosure, a two-dimensional code based positioning and navigation system is provided, which includes: a plurality of positioning mark points, a plurality of positioning labels, an image acquisition device, an image processing device, and a control device, wherein:
the positioning mark points are respectively arranged in the area to be positioned, and each positioning mark point is correspondingly provided with unique identification position information and is used for marking a fixed position in the area to be positioned;
the positioning labels are arranged corresponding to the positioning mark points, and two-dimensional code patterns are arranged on the positioning labels and used for bearing identification information to realize navigation and yaw correction on the moving object, wherein the identification information comprises unique identification position information corresponding to the positioning mark points;
the image acquisition device is used for acquiring an image containing the two-dimensional code pattern and sending the image to the image processing device;
the image processing device is connected with the image acquisition device and is used for processing the image acquired by the image acquisition device to obtain the position information and the position offset information of the moving object in the region to be positioned;
the control device is used for generating a moving instruction according to the position information and the position deviation information of the moving object in the area to be positioned and sending the moving instruction to the moving object.
Optionally, the positioning mark points are uniformly or non-uniformly distributed in the area to be positioned.
Optionally, the two-dimensional code pattern is an n × m-dimensional two-dimensional code pattern array composed of a plurality of sub two-dimensional code patterns, where n and m are positive integers.
Optionally, the geometric center of the two-dimensional code pattern array coincides with the center of the positioning mark point.
Optionally, the outer contours of the sub two-dimensional code patterns are of equal or unequal size, and the row spacing and the column spacing between every two sub two-dimensional code patterns are equal or unequal.
Optionally, the sub two-dimensional code pattern includes a sub two-dimensional code and an auxiliary graph externally surrounding the sub two-dimensional code.
Optionally, the auxiliary graphic is a closed graphic or a pattern.
Optionally, the two-dimensional code positioning tag further includes identification information and a reference pattern, wherein:
the identification information is arranged in the two-dimensional code positioning tag and is used for recording position additional information;
the reference datum graph is arranged on the periphery of the two-dimensional code positioning label and used for providing a position datum when the two-dimensional code positioning label is arranged at a positioning mark point.
According to an aspect of the present disclosure, there is provided a mobile robot including: an image acquisition device, an image processing device, a control device, and a mobile device, wherein:
the image acquisition device is used for acquiring an image containing a two-dimensional code pattern and sending the image to the image processing device;
the image processing device is connected with the image acquisition device and is used for processing the image acquired by the image acquisition device to obtain the position information and the position offset information of the mobile robot;
the control device is used for generating a moving instruction according to the position information and the position deviation information of the mobile robot and sending the moving instruction to the mobile device;
and the mobile device moves according to the movement instruction.
Optionally, the two-dimensional code pattern is an n × m-dimensional two-dimensional code pattern array composed of a plurality of sub two-dimensional code patterns, where n and m are positive integers.
Optionally, the sub two-dimensional code pattern includes a sub two-dimensional code and an auxiliary graph externally surrounding the sub two-dimensional code.
Optionally, the two-dimensional code pattern array is disposed on a positioning tag, the positioning tag is disposed corresponding to a positioning mark point in an area where the mobile robot is located, and the positioning mark point is correspondingly provided with unique identification position information.
Optionally, the geometric center of the two-dimensional code pattern array coincides with the center of the positioning mark point.
Optionally, the positioning tag further comprises identification information and a reference pattern, wherein:
the identification information is arranged in the positioning label and is used for recording position additional information;
the reference datum pattern is arranged on the periphery of the positioning label and used for providing a position datum when the positioning label is arranged at the positioning mark point.
Optionally, the optical axis of the image acquisition device passes through a geometric center of the mobile robot in a vertical plane.
By identifying a positioning mark point through the two-dimensional code pattern array and collecting the positioning label image at that mark point, the embodiments of the disclosure can decode a sub two-dimensional code in the image to obtain the relevant position information and complete navigated movement based on that position information. The technical scheme provided by the embodiments of the disclosure can accurately acquire the positioning label image and can efficiently and accurately identify the position, thereby realizing accurate positioning navigation.
Drawings
The above and other features and advantages will become more apparent to those of ordinary skill in the art by describing in detail exemplary embodiments with reference to the accompanying drawings, in which:
fig. 1A is a schematic diagram of a moving object moving area according to an embodiment of the present disclosure;
FIGS. 1B and 1C are schematic diagrams of marker point settings according to an embodiment of the present disclosure;
fig. 2 is a block diagram of a two-dimensional code based positioning navigation system according to an embodiment of the present disclosure;
FIG. 3 is a schematic view of a location tag according to an embodiment of the present disclosure;
fig. 4 is a flowchart of a positioning and navigation method based on two-dimensional codes according to an embodiment of the present disclosure;
FIG. 5 is a schematic diagram of offset and offset angle calculations according to an embodiment of the present disclosure;
fig. 6 is a flowchart of a positioning and navigation method based on two-dimensional codes according to another embodiment of the present disclosure.
Detailed Description
In order to make the technical solutions and advantages of the present disclosure more clear, the following detailed description of the present disclosure is made with reference to the accompanying drawings and specific embodiments.
Generally, in industrial applications of various scenes, a mobile robot usually moves within a working area of limited extent. In this disclosure, the area in which the mobile robot moves is referred to as the working area: for a robot moving on a two-dimensional plane, the working area is called a moving plane; for a robot moving in three-dimensional space, it is called a moving space. As shown in fig. 1A, if the mobile robot 101 moves only in the plane defined by the coordinate axes xoy of a three-dimensional space, the range defined by its moving area is called a moving plane, as shown in fig. 1B and 1C, where the shaded area represents the moving plane of the mobile robot. If the mobile robot 101 also moves in the direction of the coordinate axis z, i.e. in three dimensions, the range defined by its moving area is called a moving space.
The working area can be the warehouse floor plane or warehouse space of a storage warehouse, or the workshop floor plane or workshop space of a production workshop; these belong to working areas of a single application scene. It can also be a moving plane or moving space formed by connecting a production workshop and a storage warehouse, which belongs to a working area of a composite application scene, or a working area formed by combining several moving planes and moving spaces, for example moving planes and moving spaces on different floors connected by elevators or other transfer systems.
The two-dimensional moving plane is taken as an example below to explain the specific implementation of the two-dimensional code based positioning navigation system and of a mobile robot that positions and navigates by two-dimensional code recognition. The technical principle on a two-dimensional moving plane can be transferred to a three-dimensional moving space; for example, the three-dimensional space in which the mobile robot works can be defined by mutually orthogonal moving planes, so that two-dimensional code based positioning and navigation in a three-dimensional moving space can also be realized.
According to an aspect of the present disclosure, a two-dimensional code based positioning and navigation system is provided, as shown in fig. 2, the two-dimensional code based positioning and navigation system includes: a plurality of positioning mark points 201, a plurality of positioning labels 202, an image acquisition device 203, an image processing device 204 and a control device 205, wherein:
the positioning mark points are respectively arranged in the area to be positioned, and each positioning mark point is correspondingly provided with unique identification position information and is used for marking a fixed position in the area to be positioned;
the positioning labels are arranged corresponding to the positioning mark points, and two-dimensional code patterns are arranged on the positioning labels and used for bearing identification information to realize navigation and yaw correction on the moving object, wherein the identification information comprises unique identification position information corresponding to the positioning mark points;
the image acquisition device is used for acquiring an image containing the two-dimensional code pattern and sending the image to the image processing device;
the image processing device is connected with the image acquisition device and is used for processing the image acquired by the image acquisition device to obtain the position information and the position offset information of the moving object in the region to be positioned;
the control device is used for generating a moving instruction according to the position information and the position deviation information of the moving object in the area to be positioned and sending the moving instruction to the moving object.
The moving object may be a mobile robot or other moving objects that need to be moved by navigation.
The positioning and navigation system is further described below by taking a mobile robot as an example. As shown in fig. 1B and 1C, the plurality of positioning mark points 103 are respectively arranged in the moving area of the mobile robot, and each positioning mark point is correspondingly provided with unique identification position information. In order to identify the position information of the positioning mark points, a real coordinate system xoy is first established corresponding to the moving plane 102 defined by the moving area of the mobile robot, and a plurality of mark points are then set under this real coordinate system. The mark points may be uniformly distributed as shown in fig. 1B or non-uniformly distributed as shown in fig. 1C; for example, they may be set densely in some local areas, as shown by the dashed box in fig. 1C, and sparsely in others. The mark points lying within the moving plane 102 are the positioning mark points 103 mentioned above. The position serial numbers of a positioning mark point 103 along the x-axis and y-axis of the real coordinate system xoy can be used as its real coordinate values, i.e. its unique identification position information; for example, the real coordinate values of the positioning mark point 103 marked in fig. 1C are (3, 2). Once the positioning mark points 103 mark fixed positions in the moving area, the mobile robot can move along a path formed by the positioning mark points 103, thereby realizing navigation of the moving object.
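To make the relationship between a mark point's serial numbers and its real coordinate values concrete, the following minimal sketch assumes a uniform grid with a known pitch; the function name and the pitch value are illustrative assumptions, not taken from this disclosure.

```python
# Minimal sketch: converting a positioning mark point's serial numbers (ix, iy)
# into real-plane coordinates, assuming a uniform grid (pitch value is illustrative).

GRID_PITCH_MM = 1000.0  # assumed physical distance between adjacent mark points


def mark_point_to_real_xy(ix: int, iy: int, pitch_mm: float = GRID_PITCH_MM) -> tuple:
    """Return the real-plane coordinates of the mark point with serial numbers (ix, iy)."""
    return ix * pitch_mm, iy * pitch_mm


if __name__ == "__main__":
    # The mark point identified as (3, 2) in fig. 1C would sit here in the real coordinate system.
    print(mark_point_to_real_xy(3, 2))  # (3000.0, 2000.0)
```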
The positioning tags and the positioning mark points are correspondingly arranged, for example, one positioning tag is arranged corresponding to each positioning mark point, and a two-dimensional code pattern which can be identified by a mobile robot and is formed by sub two-dimensional code patterns is arranged on the positioning tags.
In an embodiment of the present disclosure, the positioning tag and the positioning mark point are arranged correspondingly in such a way that the geometric center of the two-dimensional code pattern array carried by the positioning tag coincides with the center of the positioning mark point. That is, regardless of the shape of the positioning tag itself, and regardless of whether a size deviation occurs during manufacture of the tag or the center of the two-dimensional code pattern array is offset within the tag, the geometric center of the two-dimensional code pattern array and the center of the positioning mark point should remain coincident.
In an embodiment of the present disclosure, the outer contour sizes of the sub two-dimensional code patterns constituting the two-dimensional code pattern array are equal, and the sub two-dimensional code patterns are spaced apart by an equal row spacing d1 and an equal column spacing d2. Of course, in other embodiments of the present disclosure, the outer contour sizes of the sub two-dimensional code patterns may be unequal and the spacings between them need not be equal; those skilled in the art may arrange the sub two-dimensional code patterns according to the needs of the practical application, which is not particularly limited by the present disclosure.
Fig. 3 is a schematic diagram of a positioning tag according to an embodiment of the present disclosure. As shown in fig. 3, the two-dimensional code pattern carried by the positioning tag 302 is a 3 × 3 sub two-dimensional code pattern array. In the array, each sub two-dimensional code pattern includes a sub two-dimensional code 305 and an auxiliary graphic 306 externally enclosing the sub two-dimensional code; the auxiliary graphic 306 is used to help the mobile robot identify the sub two-dimensional code 305 more efficiently and accurately. The auxiliary graphic 306 may be of any shape, but to allow the mobile robot to identify the sub two-dimensional code 305 more efficiently, the auxiliary graphic 306 may be set as a square frame with side length l, in which case the outer contour of each sub two-dimensional code pattern is the contour formed by the auxiliary graphic 306. In fig. 3, the row pitch between the sub two-dimensional code patterns is d2 and the column pitch is d1. In addition, the positioning tag is further provided with identification information 307 and a reference pattern 308. The identification information 307 is used to record additional information, for example information that helps identify the position of the positioning mark point accurately, such as the real coordinate values of the corresponding positioning mark point; it can be placed at any position in the positioning tag, such as the center point or a blank area. The reference pattern 308 is arranged at the periphery of the positioning tag and provides a position reference when the positioning tag is placed at the positioning mark point 103, so as to prevent the geometric center of the two-dimensional code pattern array from being offset from the center of the positioning mark point 103.
As mentioned above, the auxiliary graphic 306 is not limited to a square frame; it may be any other graphic or pattern capable of surrounding the sub two-dimensional code, such as a closed figure like a circle or an ellipse, or a non-closed figure like an L-shape or an inverted L-shape, as long as it helps the mobile robot recognize the complete pattern of the sub two-dimensional code 305.
In a two-dimensional code pattern array with n rows and m columns, the sub two-dimensional code pattern 305 located in the i-th row and j-th column correspondingly stores a group of identification information, which includes one or more of: the real coordinate values of the positioning mark point corresponding to the sub two-dimensional code pattern, the dimensions n and m of the two-dimensional code pattern array, the subscript value i × m + j of the sub two-dimensional code pattern, the side length l of the sub two-dimensional code pattern, and the row spacing d1 and column spacing d2 between the sub two-dimensional code patterns.
In a specific embodiment, the information stored in a sub two-dimensional code pattern can be represented as the array (1, 1, 3, 3, 4, 30, 30), which indicates that: the real coordinate values of the positioning mark point corresponding to the current sub two-dimensional code pattern are (1, 1), the dimensions of the two-dimensional code pattern array are 3 and 3, the subscript value of the sub two-dimensional code is 4, the side length of the sub two-dimensional code pattern is 30 mm, and the row spacing and column spacing between adjacent sub two-dimensional code patterns are both 30 mm.
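To make the stored array concrete in code, the sketch below packs and unpacks the seven-element identification tuple as a comma-separated string, which is one plausible way to serialize it into a sub two-dimensional code payload; the field order follows the example above, but the class name, function names, and serialization format are assumptions.

```python
from dataclasses import dataclass


@dataclass
class SubCodeInfo:
    """Identification information carried by one sub two-dimensional code (field order as in the example)."""
    mark_x: int        # real x coordinate value of the corresponding positioning mark point
    mark_y: int        # real y coordinate value of the corresponding positioning mark point
    rows: int          # n, number of rows of the two-dimensional code pattern array
    cols: int          # m, number of columns of the array
    index: int         # subscript value i*m + j of this sub code within the array
    side_mm: float     # side length l of the sub two-dimensional code pattern, in mm
    spacing_mm: float  # row/column spacing between adjacent sub codes, in mm


def encode_payload(info: SubCodeInfo) -> str:
    # Comma-separated serialization is an assumption; the disclosure only fixes the field contents.
    fields = (info.mark_x, info.mark_y, info.rows, info.cols, info.index, info.side_mm, info.spacing_mm)
    return ",".join(str(v) for v in fields)


def decode_payload(payload: str) -> SubCodeInfo:
    x, y, n, m, idx, side, gap = payload.split(",")
    return SubCodeInfo(int(x), int(y), int(n), int(m), int(idx), float(side), float(gap))


if __name__ == "__main__":
    info = decode_payload("1,1,3,3,4,30,30")  # the example array (1, 1, 3, 3, 4, 30, 30)
    print(info.index, info.side_mm)           # 4 30.0
```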
The image acquisition device can be a camera, a video camera, or any other device capable of acquiring images.
The image acquisition device, the image processing device, and the control device can be arranged independently of the moving object or integrated on the moving object. Using this positioning navigation system, and based on positioning mark points marked by two-dimensional code pattern arrays, the mobile robot can collect positioning label images at the positioning mark points, decode the sub two-dimensional codes in the images to obtain the relevant position information, and complete navigated movement based on that position information.
According to another aspect of the present disclosure, a two-dimensional code based positioning and navigation method is provided. As shown in fig. 4, the two-dimensional code based positioning and navigation method includes the following steps:
Step 401, collecting an image in the moving area of a moving object to obtain an image to be processed containing a two-dimensional code pattern;
the mobile area is provided with a plurality of positioning mark points and a plurality of positioning labels correspondingly arranged with the positioning mark points, and each positioning mark point is correspondingly provided with unique identification position information used for marking a fixed position in the area to be positioned; the positioning label is provided with a two-dimensional code pattern for bearing identification information to realize navigation and yaw correction of the moving object, wherein the identification information comprises unique identification position information corresponding to the positioning mark point, the pattern is an n-m-dimensional two-dimensional code pattern array formed by a plurality of sub two-dimensional code patterns, and n and m are positive integers.
In an embodiment of the present disclosure, the outer contour sizes of the sub two-dimensional code patterns constituting the two-dimensional code pattern array are equal, and the sub two-dimensional code patterns are spaced apart by an equal row spacing d1 and an equal column spacing d2. Of course, in other embodiments of the present disclosure, the outer contour sizes of the sub two-dimensional code patterns may be unequal and the spacings between them need not be equal; those skilled in the art may arrange the sub two-dimensional code patterns according to the needs of the practical application, which is not particularly limited by the present disclosure.
In an embodiment of the present disclosure, each sub two-dimensional code pattern includes a sub two-dimensional code 305 and an auxiliary graphic 306 externally enclosing the sub two-dimensional code. The auxiliary graphic 306 is used to help the mobile robot identify the sub two-dimensional code 305 more efficiently and accurately; it may be of any shape, but to allow the mobile robot to identify the sub two-dimensional code 305 more efficiently, the auxiliary graphic 306 may be set as a square frame with side length l.
In an embodiment of the present disclosure, the positioning tag is further provided with identification information 307 and a reference pattern 308. The identification information 307 is used to record additional information, for example information that helps identify the position of the positioning mark point accurately, such as the real coordinate values of the corresponding positioning mark point; it can be placed at any position in the positioning tag, such as the center point or a blank area. The reference pattern 308 is arranged at the periphery of the positioning tag and provides a position reference when the positioning tag is placed at the positioning mark point 103, so as to prevent the geometric center of the two-dimensional code pattern array from being offset from the center of the positioning mark point 103.
The moving object may be a mobile robot or other moving objects that need to be moved by navigation.
The image in the moving area of the moving object can be collected in the form of a video stream.
In an embodiment of the present disclosure, each frame of image acquired through the video stream is taken as an image to be processed.
In another embodiment of the present disclosure, one frame out of several frames acquired from a video stream of preset length is used as the image to be processed. Assuming the length of the video stream is k, i.e. k frames of images are acquired each time, one frame is selected from the k frames as the image to be processed for subsequent processing.
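One possible frame-selection loop matching this description is sketched below. It assumes OpenCV's VideoCapture interface and simply takes the middle frame of every group of k frames, which is an arbitrary choice among those the text allows.

```python
import cv2  # assumed dependency; any frame source with the same read() contract would do


def frames_to_process(source, k=5):
    """Yield one image to be processed from every group of k captured frames."""
    cap = cv2.VideoCapture(source)
    buffer = []
    try:
        while True:
            ok, frame = cap.read()
            if not ok:
                break
            buffer.append(frame)
            if len(buffer) == k:
                yield buffer[k // 2]  # pick the middle frame of the group (illustrative choice)
                buffer.clear()
    finally:
        cap.release()
```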
Step 402, performing image preprocessing on the image to be processed and determining whether it contains a complete sub two-dimensional code pattern; if so, continuing to step 403; otherwise, repeating step 402 and preprocessing the next image to be processed until an image to be processed containing a complete sub two-dimensional code pattern is obtained, then continuing to step 403;
wherein the image pre-processing may comprise: one or more of erosion, dilation, edge detection, contour extraction, etc.
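These preprocessing operations map directly onto standard image-processing primitives. A minimal sketch using OpenCV (4.x) is shown below; the kernel size, Canny thresholds, and the squareness test used to decide whether a candidate auxiliary frame, and hence a complete sub two-dimensional code pattern, is present are illustrative assumptions rather than values from the disclosure.

```python
import cv2
import numpy as np


def find_complete_sub_code_boxes(image_bgr: np.ndarray):
    """Return axis-aligned bounding boxes (x, y, w, h) of candidate auxiliary square frames."""
    gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)
    # Erosion followed by dilation suppresses small stains before edge detection.
    kernel = np.ones((3, 3), np.uint8)
    cleaned = cv2.dilate(cv2.erode(gray, kernel), kernel)
    edges = cv2.Canny(cleaned, 50, 150)
    contours, _ = cv2.findContours(edges, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)

    boxes = []
    for contour in contours:
        approx = cv2.approxPolyDP(contour, 0.02 * cv2.arcLength(contour, True), True)
        x, y, w, h = cv2.boundingRect(approx)
        # Keep roughly square, sufficiently large quadrilaterals as candidate auxiliary frames.
        if len(approx) == 4 and w > 40 and 0.8 < w / float(h) < 1.25:
            boxes.append((x, y, w, h))
    return boxes
```

An empty result would correspond to the "no complete sub two-dimensional code pattern" branch of step 402, in which case the next image to be processed is taken.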
It should be noted that when, as in step 401, each frame of image acquired from the video stream is taken as an image to be processed, then in step 402, while the previous frame is still being preprocessed, any newly input frame is discarded; only after the current frame has been processed can the next input image to be processed be preprocessed.
In an embodiment of the present disclosure, step 402 further includes a step of extracting the sub two-dimensional code pattern from the image to be processed: an image block 509 containing the sub two-dimensional code pattern is extracted as the circumscribed rectangle of the auxiliary graphic of the sub two-dimensional code pattern, as shown in fig. 5.
Step 403, decoding the complete sub two-dimensional code pattern to obtain the identification information stored in the sub two-dimensional code pattern;
as mentioned above, in each n rows and m columns of the two-dimensional code image array, the sub two-dimensional code pattern 305 located in the ith row and jth column correspondingly stores a set of identification information, which includes: the real coordinate value of the positioning mark point corresponding to the sub two-dimensional code pattern, the side length n and m of the two-dimensional code pattern array, the subscript value i × m + j of the sub two-dimensional code pattern, the side length l of the sub two-dimensional code pattern, and one or more of the row spacing d1 and the column spacing d2 between the sub two-dimensional code patterns.
Step 404, calculating the offset of the moving object relative to the center of the two-dimensional code pattern array in the positioning label corresponding to the positioning mark point and the offset angle of the moving object relative to the real coordinate system of the moving area by using the identification information stored in the sub two-dimensional code pattern and the information of the sub two-dimensional code pattern in the current image to be processed;
in an embodiment of the present disclosure, a contour center point of the moving object on the vertical plane coincides with a center point of the image to be processed in the moving plane direction, and therefore, the offset refers to an offset of the center point of the current image to be processed with respect to a center of the two-dimensional code pattern array in the positioning tag corresponding to the positioning mark point.
Fig. 5 is a schematic diagram illustrating offset and offset angle calculation according to an embodiment of the disclosure, as shown in fig. 5, an input image to be processed 510 includes 4 complete sub two-dimensional code patterns, and the complete sub two-dimensional code pattern currently performing decoding processing is the sub two-dimensional code pattern located at the upper left corner of the image to be processed 510.
As shown in fig. 5, a coordinate system x1o1y1 is established with the center of the two-dimensional code pattern array as its origin, and a coordinate system x2o2y2 is established with one corner point of the current image to be processed 510 as its origin, where the included angle between the coordinate system x1o1y1 and the real coordinate system xoy is n × π/2, n being a natural number, for example 0, 1, 2, or 3. The offsets are expressed as the offsets Δx and Δy of the center point 511 of the current image to be processed 510 along the x1 axis and the y1 axis relative to the center o1 of the two-dimensional code pattern array, i.e. its coordinate values on the x1 and y1 axes; the offset angle is expressed as the angle between the coordinate axes x2, y2 of the current image to be processed 510 and the corresponding coordinate axes of the real coordinate system xoy.
When calculating the offset and the offset angle, an image block 509 containing the sub two-dimensional code pattern is first cut out as a circumscribed rectangle, the image block 509 being axis-parallel to the image to be processed 510. The position of the center of the sub two-dimensional code pattern within the image block 509 is then obtained from the pixel value relationships. Using the positional relationship between the image block 509 and the image to be processed 510, the positional relationship between the center of the sub two-dimensional code and the center point 511 of the image to be processed 510 is obtained in the coordinate system x2o2y2 of the current image. The position of the center of the sub two-dimensional code pattern in the coordinate system x1o1y1 of the two-dimensional code pattern array is obtained from the identification information stored in the sub two-dimensional code pattern. Through these coordinate transformations, the position of the center point 511 of the image to be processed 510 in the coordinate system x1o1y1 is obtained, giving the offsets Δx and Δy of the center point 511 relative to the coordinate origin o1 of x1o1y1 and the offset angle θ of the image frame 510 relative to the coordinate system x1o1y1. Since the origin o1 of the coordinate system x1o1y1 is the center of the two-dimensional code pattern array and the angle between x1o1y1 and the real coordinate system xoy is the fixed value n × π/2, the offsets Δx and Δy and the offset angle θ reflect both the positional relationship between the moving object and the center of the positioning mark point and the angular offset of the moving object relative to the real coordinate system.
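A compact sketch of this chain of coordinate transforms is given below. It assumes that the pixel position and pixel side length of the decoded sub code have already been measured, that the sub code's row and column are recovered from its subscript value as i = idx // m and j = idx % m, and that the offset angle theta has been measured separately; the sign conventions and variable names are illustrative assumptions, not the disclosure's definitions.

```python
import math


def image_center_offset(idx, n, m, side_mm, gap_mm,
                        code_center_px, code_side_px, image_size_px, theta_rad):
    """Offset (dx, dy) in mm of the image centre 511 from the array centre o1, in coordinates x1o1y1.

    Assumed conventions: rows are counted downward while y1 points upward, and theta_rad is the
    measured rotation of the image axes with respect to x1o1y1.
    """
    i, j = divmod(idx, m)
    pitch = side_mm + gap_mm                      # centre-to-centre distance between sub codes
    code_x1 = (j - (m - 1) / 2.0) * pitch         # sub code centre expressed in array coordinates
    code_y1 = ((n - 1) / 2.0 - i) * pitch

    mm_per_px = side_mm / code_side_px            # scale factor from the known side length l
    w, h = image_size_px
    dx_px = w / 2.0 - code_center_px[0]           # image centre relative to sub code centre (pixels)
    dy_px = code_center_px[1] - h / 2.0           # sign flip: image y grows downward

    # Rotate the pixel-frame vector into the array frame and convert to millimetres.
    dx_mm = (dx_px * math.cos(theta_rad) - dy_px * math.sin(theta_rad)) * mm_per_px
    dy_mm = (dx_px * math.sin(theta_rad) + dy_px * math.cos(theta_rad)) * mm_per_px
    return code_x1 + dx_mm, code_y1 + dy_mm
```

With the example tag above (3 × 3 array, 30 mm codes, 30 mm spacing), a robot whose camera centre sits exactly over the array centre with aligned axes would get offsets close to (0, 0).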
Step 405, based on the offset amount and the offset angle, a moving instruction is made to the moving object.
Further, the offset and the offset angle may be expressed as a physical length value and an angle value corresponding to a specific physical length unit, and then a move instruction is made based on the physical length value and the angle value.
For example, the correspondence between pixels and physical length/width is obtained from the known actual physical length and width of the two-dimensional code pattern and the corresponding numbers of pixels, and the offset is then converted into a physical length/width value.
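In code form, this pixel-to-physical conversion reduces to a single scale factor; the sketch below assumes the auxiliary square's side length is known both in millimetres and in image pixels.

```python
def pixels_to_mm(offset_px, code_side_px, code_side_mm=30.0):
    """Convert a pixel-space offset to millimetres using the known side length of the sub code pattern."""
    return offset_px * (code_side_mm / code_side_px)


# Example: a 120-pixel offset, with the 30 mm auxiliary square measuring 90 px in the image.
assert abs(pixels_to_mm(120, 90) - 40.0) < 1e-9
```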
According to another aspect of the present disclosure, a two-dimensional code based positioning and navigation method is provided, as shown in fig. 6, the two-dimensional code based positioning and navigation method includes the following steps:
step 601, collecting images in a moving area of a moving object to obtain an image to be processed containing a two-dimensional code pattern;
the mobile area is provided with a plurality of positioning mark points and a plurality of positioning labels correspondingly arranged with the positioning mark points, and each positioning mark point is correspondingly provided with unique identification position information used for marking a fixed position in the area to be positioned; the positioning label is provided with a two-dimensional code pattern for bearing identification information to realize navigation and yaw correction of the moving object, wherein the identification information comprises unique identification position information corresponding to the positioning mark point, the pattern is an n-m-dimensional two-dimensional code pattern array formed by a plurality of sub two-dimensional code patterns, and n and m are positive integers.
In an embodiment of the present disclosure, the outer contour sizes of the sub two-dimensional code patterns constituting the two-dimensional code pattern array are equal, and the sub two-dimensional code patterns are spaced apart by an equal row spacing d1 and an equal column spacing d2. Of course, in other embodiments of the present disclosure, the outer contour sizes of the sub two-dimensional code patterns may be unequal and the spacings between them need not be equal; those skilled in the art may arrange the sub two-dimensional code patterns according to the needs of the practical application, which is not particularly limited by the present disclosure.
In an embodiment of the present disclosure, each sub two-dimensional code pattern includes a sub two-dimensional code 305 and an auxiliary graphic 306 externally enclosing the sub two-dimensional code. The auxiliary graphic 306 is used to help the mobile robot identify the sub two-dimensional code 305 more efficiently and accurately; it may be of any shape, but to allow the mobile robot to identify the sub two-dimensional code 305 more efficiently, the auxiliary graphic 306 may be set as a square frame with side length l.
In an embodiment of the present disclosure, the positioning tag is further provided with identification information 307 and a reference pattern 308. The identification information 307 is used to record additional information, for example information that helps identify the position of the positioning mark point accurately, such as the real coordinate values of the corresponding positioning mark point; it can be placed at any position in the positioning tag, such as the center point or a blank area. The reference pattern 308 is arranged at the periphery of the positioning tag and provides a position reference when the positioning tag is placed at the positioning mark point 103, so as to prevent the geometric center of the two-dimensional code pattern array from being offset from the center of the positioning mark point 103.
The moving object may be a mobile robot or other moving objects that need to be moved by navigation.
The image in the moving area of the moving object can be collected in the form of a video stream.
In an embodiment of the present disclosure, each frame of image acquired through the video stream is taken as an image to be processed.
In another embodiment of the present disclosure, one frame out of several frames acquired from a video stream of preset length is used as the image to be processed. Assuming the length of the video stream is k, i.e. k frames of images are acquired each time, one frame is selected from the k frames as the image to be processed for subsequent processing.
Step 602, performing image preprocessing on the image to be processed and determining whether it contains a complete sub two-dimensional code pattern; if so, continuing to step 603; otherwise, repeating step 602 and preprocessing the next image to be processed until an image to be processed containing a complete sub two-dimensional code pattern is obtained, then continuing to step 603;
wherein the image pre-processing may comprise: one or more of erosion, dilation, edge detection, contour extraction, etc.
It should be noted that when, as in step 601, each frame of image acquired from the video stream is taken as an image to be processed, then in step 602, while the previous frame is still being preprocessed, any newly input frame is discarded; only after the current frame has been processed can the next input image to be processed be preprocessed.
In an embodiment of the present disclosure, step 602 further includes a step of extracting the sub two-dimensional code pattern from the image to be processed: an image block 509 containing the sub two-dimensional code pattern is extracted as the circumscribed rectangle of the auxiliary graphic of the sub two-dimensional code pattern, as shown in fig. 5.
Step 603, determining whether the number of acquired complete sub two-dimensional code patterns is greater than 1; if so, going to step 604, otherwise going to step 605;
step 604, selecting a sub two-dimensional code pattern which is not decoded from a plurality of complete sub two-dimensional code patterns;
the sub two-dimensional code patterns which are not decoded can be selected by randomly selecting, selecting the middle sub two-dimensional code pattern or selecting the first/last sub two-dimensional code pattern.
In an embodiment of the present disclosure, when selecting a sub two-dimensional code pattern on which decoding has not yet been performed, the decoded and undecoded sub two-dimensional code patterns may first be distinguished and one of the undecoded ones selected; alternatively, after each decoding operation, the sub two-dimensional code pattern that was decoded may be deleted from the set of candidate sub two-dimensional code patterns, thereby ensuring that only undecoded sub two-dimensional code patterns can be selected the next time.
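The bookkeeping of trying sub codes one at a time until a decode succeeds can be expressed as a simple loop over a shrinking candidate set, as sketched below; the iteration-order selection strategy and the decode callback are placeholders, not fixed by the disclosure.

```python
from typing import Callable, Iterable, Optional


def decode_first_successful(blocks: Iterable, decode: Callable[[object], Optional[str]]) -> Optional[str]:
    """Try each complete sub code block once, removing it from the candidates after a failed decode."""
    remaining = list(blocks)
    while remaining:
        block = remaining.pop(0)   # illustrative choice; random or middle selection would also match the text
        payload = decode(block)
        if payload is not None:
            return payload         # step 606: decoding succeeded, proceed to the offset calculation
    return None                    # step 607: all complete sub codes tried, move on to the next frame
```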
Step 605, decoding the sub two-dimensional code pattern to obtain the identification information stored in the sub two-dimensional code pattern;
as mentioned above, in each n rows and m columns of the two-dimensional code image array, the sub two-dimensional code pattern 305 located in the ith row and jth column correspondingly stores a set of identification information, which includes: the real coordinate value of the positioning mark point corresponding to the sub two-dimensional code pattern, the side length n and m of the two-dimensional code pattern array, the subscript value i × m + j of the sub two-dimensional code pattern, the side length l of the sub two-dimensional code pattern, and one or more information of the row spacing d1 and the column spacing d2 between the sub two-dimensional code patterns.
Step 606, determining whether decoding of the sub two-dimensional code pattern succeeded; if so, going to step 608, otherwise going to step 607;
in step 607, determine if all the complete sub-two-dimensional code patterns have been decoded? If yes, returning to step 602, and performing preprocessing on the next image to be processed; otherwise, returning to the step 604, selecting a complete sub two-dimensional code pattern which is not decoded;
step 608, calculating the offset of the moving object relative to the center of the two-dimensional code pattern array in the positioning tag corresponding to the positioning mark point and the offset angle of the moving object relative to the real coordinate system where the moving area is located, by using the identification information stored in the sub two-dimensional code pattern and the information of the sub two-dimensional code pattern in the current image to be processed;
the calculation of the specific offset and offset angle has been described in detail above and will not be described further here.
Step 609, based on the offset and the offset angle, a moving instruction is made to the moving object.
Further, the offset and the offset angle may be expressed as a physical length value and an angle value corresponding to a specific physical length unit, and then a move instruction is made based on the physical length value and the angle value.
For example, the correspondence between pixels and physical length/width is obtained from the known actual physical length and width of the two-dimensional code pattern and the corresponding numbers of pixels, and the offset is then converted into a physical length/width value.
According to another aspect of the present disclosure, a two-dimensional code positioning tag is provided, where the two-dimensional code positioning tag is disposed at a positioning mark point, and the two-dimensional code positioning tag includes:
a plurality of sub two-dimensional code patterns storing position identification information, wherein the sub two-dimensional code patterns form an n × m two-dimensional code pattern array, and n and m are positive integers;
the geometric center of the two-dimensional code pattern array coincides with the center of the positioning mark point;
the sub two-dimensional code pattern comprises a sub two-dimensional code and an auxiliary graphic externally surrounding the sub two-dimensional code.
The above-mentioned features of the two-dimensional code positioning tag and each component have been described in detail, and are not described herein again.
According to another aspect of the present disclosure, there is provided a mobile robot including: an image acquisition device, an image processing device, a control device, and a mobile device, wherein:
the image acquisition device is used for acquiring an image containing a two-dimensional code pattern and sending the image to the image processing device;
the image processing device is connected with the image acquisition device and is used for processing the image acquired by the image acquisition device to obtain the position information and the position offset information of the mobile robot;
the control device is used for generating a moving instruction according to the position information and the position deviation information of the mobile robot and sending the moving instruction to the mobile device;
and the mobile device moves according to the movement instruction.
Wherein the optical axis of the image acquisition device passes through the geometric center of the mobile robot on a vertical plane.
The mobile device can be a moving mechanism of the mobile robot, such as a hand, a foot, or a bottom wheel.
The features described above with respect to the mobile robot and the various other components have been described in detail and are not repeated here.
According to another aspect of the present disclosure, a method for calculating a movement deviation based on a two-dimensional code is provided, the method comprising the steps of:
intercepting an image block containing a sub two-dimensional code pattern in a circumscribed rectangle mode, wherein the image block is axially parallel to an image to be processed, the image to be processed comprises the two-dimensional code pattern, the two-dimensional code pattern is an n x m-dimensional two-dimensional code pattern array formed by a plurality of sub two-dimensional code patterns, and n and m are positive integers;
obtaining the position information of the center of the sub two-dimensional code pattern in the image block by utilizing the pixel value relation;
obtaining the positional relationship between the center of the sub two-dimensional code and the center point of the image to be processed in the current coordinate system of the image to be processed, by using the positional relationship between the image block and the image to be processed;
obtaining the position information of the center of the sub two-dimensional code pattern in the coordinate system of the two-dimensional code pattern array by utilizing the identification information stored in the sub two-dimensional code pattern;
and obtaining the position information of the central point of the image to be processed in the coordinate system through coordinate transformation, and further obtaining the offset of the central point of the image to be processed relative to the coordinate origin of the sub two-dimensional code pattern array coordinate system and the offset angle of the image frame to be processed relative to the sub two-dimensional code pattern array coordinate system.
The above-described method steps and the features of the components involved have been described in detail and are not described in detail here.
According to another aspect of the present disclosure, an electronic device is presented, comprising a memory and a processor; wherein,
the memory is used to store one or more computer instructions, which are executed by the processor to implement the above-described method steps.
According to yet another aspect of the present disclosure, a computer-readable storage medium is also presented, having stored thereon computer instructions, which, when executed by a processor, implement the above-mentioned method steps.
The above-listed detailed description is merely a specific description of possible embodiments of the present disclosure, and is not intended to limit the scope of the disclosure, which is intended to include within its scope equivalent embodiments or modifications that do not depart from the technical spirit of the present disclosure.
The present disclosure discloses: a1, a two-dimensional code location label, two-dimensional code location label sets up in the location mark point, two-dimensional code location label includes: the position identification information comprises a plurality of sub two-dimensional code patterns which store position identification information, wherein the sub two-dimensional code patterns form an n-m-dimensional two-dimensional code pattern array, and n and m are positive integers; the geometric center of the two-dimensional code pattern array is coincided with the center of the positioning mark point; the sub two-dimensional code pattern comprises a sub two-dimensional code and an auxiliary graph for externally surrounding the sub two-dimensional code. A2, according to the two-dimensional code positioning label in A1, the outer contour sizes of the sub two-dimensional code patterns are equal or different, and the row spacing and the column spacing between every two sub two-dimensional code patterns are equal or different. A3, positioning the label according to the two-dimensional code A1, wherein the auxiliary graph is a closed graph or a pattern. A4, positioning the label according to the two-dimensional code A3, wherein the auxiliary graph is a square frame. A5, the two-dimensional code positioning label according to A1, further comprising identification information and a reference datum pattern, wherein: the identification information is arranged in the two-dimensional code positioning tag and is used for recording position additional information; the reference datum graph is arranged on the periphery of the two-dimensional code positioning label and used for providing a position datum when the two-dimensional code positioning label is arranged at a positioning mark point.
The present disclosure discloses: b6, a two-dimensional code based positioning navigation system, the two-dimensional code based positioning navigation system includes: a plurality of location mark points, a plurality of location label, image acquisition device, image processing apparatus and controlling means, wherein: the positioning mark points are respectively arranged in the area to be positioned, and each positioning mark point is correspondingly provided with unique identification position information and is used for marking a fixed position in the area to be positioned; the positioning labels are arranged corresponding to the positioning mark points, and two-dimensional code patterns are arranged on the positioning labels and used for bearing identification information to realize navigation and yaw correction on the moving object, wherein the identification information comprises unique identification position information corresponding to the positioning mark points; the image acquisition device is used for acquiring an image containing the two-dimensional code pattern and sending the image to the image processing device; the image processing device is connected with the image acquisition device and is used for processing the image acquired by the image acquisition device to obtain the position information and the position offset information of the moving object in the region to be positioned; the control device is used for generating a moving instruction according to the position information and the position deviation information of the moving object in the area to be positioned and sending the moving instruction to the moving object. B7, according to the system of B6, the positioning mark points are uniformly or non-uniformly distributed in the area to be positioned. B8, the system according to B6, wherein the two-dimensional code pattern is an n x m-dimensional two-dimensional code pattern array composed of a plurality of sub two-dimensional code patterns, and n and m are positive integers. B9, according to the system of B8, the geometric center of the two-dimensional code pattern array coincides with the center of the positioning mark point. B10, according to the system of B8, the outer contour sizes of the sub two-dimensional code patterns are equal or different, and the row spacing and the column spacing between every two sub two-dimensional code patterns are equal or different. B11, according to the system of B8, the sub two-dimensional code pattern comprises a sub two-dimensional code and an auxiliary graph for externally surrounding the sub two-dimensional code. B12, the system according to B11, the auxiliary graphic being a closed graphic or pattern. B13, the system according to B8, the two-dimensional code positioning label further comprises identification information and a reference datum graph, wherein: the identification information is arranged in the two-dimensional code positioning tag and is used for recording position additional information; the reference datum graph is arranged on the periphery of the two-dimensional code positioning label and used for providing a position datum when the two-dimensional code positioning label is arranged at a positioning mark point.
The present disclosure discloses: c14, a mobile robot, the mobile robot comprising: image acquisition device, image processing apparatus, controlling means and mobile device, wherein: the image acquisition device is used for acquiring an image containing a two-dimensional code pattern and sending the image to the image processing device; the image processing device is connected with the image acquisition device and is used for processing the image acquired by the image acquisition device to obtain the position information and the position offset information of the mobile robot; the control device is used for generating a moving instruction according to the position information and the position deviation information of the mobile robot and sending the moving instruction to the mobile device; and the mobile device moves according to the movement instruction. C15, the mobile robot according to C14, the two-dimensional code pattern is an n x m-dimensional two-dimensional code pattern array composed of a plurality of sub two-dimensional code patterns, wherein n and m are positive integers. C16, the mobile robot according to C15, wherein the sub two-dimensional code pattern comprises a sub two-dimensional code and an auxiliary graph for externally surrounding the sub two-dimensional code. C17, according to the mobile robot of C15, the two-dimensional code pattern array sets up on the location label, the location label corresponds the setting with the location mark point in the mobile robot region, wherein, location mark point corresponds and is equipped with only discernment positional information. C18, the mobile robot according to C17, the geometric center of the two-dimensional code pattern array is coincident with the center of the positioning mark point. C19, the mobile robot of C17, the locator tag further comprising identification information and a reference fiducial pattern, wherein: the identification information is arranged in the positioning label and is used for recording position additional information; the reference datum pattern is arranged on the periphery of the positioning label and used for providing a position datum when the positioning label is arranged at the positioning mark point. C20, the mobile robot of C14, the optical axis of the image capture device passing through the geometric center of the mobile robot in a vertical plane.

Claims (15)

1. A two-dimensional code based positioning navigation system, characterized in that the system comprises: a plurality of positioning mark points, a plurality of positioning labels, an image acquisition device, an image processing device and a control device, wherein:
the positioning mark points are respectively arranged in the area to be positioned, and each positioning mark point is assigned unique position identification information and marks a fixed position in the area to be positioned;
the positioning labels are arranged in correspondence with the positioning mark points, and two-dimensional code patterns are arranged on the positioning labels to carry identification information for navigation and yaw correction of the moving object, the identification information including the unique position identification information of the corresponding positioning mark point;
the image acquisition device is used for acquiring an image containing the two-dimensional code pattern and sending the image to the image processing device;
the image processing device is connected with the image acquisition device and is used for processing the acquired image to obtain the position information and position offset information of the moving object in the area to be positioned;
the control device is used for generating a movement instruction according to the position information and position offset information of the moving object in the area to be positioned and sending the movement instruction to the moving object.
2. The system according to claim 1, wherein the positioning mark points are distributed uniformly or non-uniformly in the area to be positioned.
3. The system of claim 1, wherein the two-dimensional code pattern is an n x m two-dimensional code pattern array composed of a plurality of sub two-dimensional code patterns, where n and m are positive integers.
4. The system of claim 3, wherein the geometric center of the two-dimensional code pattern array coincides with the center of the positioning mark point.
5. The system of claim 3, wherein the outer contour dimensions of the sub two-dimensional code patterns are either equal or different, and the row spacing and column spacing between adjacent sub two-dimensional code patterns are either equal or different (this layout geometry is illustrated by the sketch following the claims).
6. The system of claim 3, wherein the sub two-dimensional code pattern comprises a sub two-dimensional code and an auxiliary graphic externally enclosing the sub two-dimensional code.
7. The system of claim 6, wherein the auxiliary graphic is a closed graphic or pattern.
8. The system of claim 3, wherein the two-dimensional code positioning label further comprises identification information and a reference datum graphic, wherein:
the identification information is arranged in the two-dimensional code positioning label and is used for recording additional position information;
the reference datum graphic is arranged on the periphery of the two-dimensional code positioning label and is used for providing a position datum when the label is placed at a positioning mark point.
9. A mobile robot, characterized in that the mobile robot comprises: an image acquisition device, an image processing device, a control device and a mobile device, wherein:
the image acquisition device is used for acquiring an image containing a two-dimensional code pattern and sending the image to the image processing device;
the image processing device is connected with the image acquisition device and is used for processing the acquired image to obtain the position information and position offset information of the mobile robot;
the control device is used for generating a movement instruction according to the position information and position offset information of the mobile robot and sending the movement instruction to the mobile device;
and the mobile device moves according to the movement instruction.
10. The mobile robot of claim 9, wherein the two-dimensional code pattern is an n x m two-dimensional code pattern array composed of a plurality of sub two-dimensional code patterns, where n and m are positive integers.
11. The mobile robot of claim 10, wherein the sub two-dimensional code pattern comprises a sub two-dimensional code and an auxiliary graphic externally enclosing the sub two-dimensional code.
12. The mobile robot of claim 10, wherein the two-dimensional code pattern array is arranged on a positioning tag, and the positioning tag is arranged in correspondence with a positioning mark point in the area where the mobile robot is located, the positioning mark point being assigned unique position identification information.
13. The mobile robot of claim 12, wherein a geometric center of the two-dimensional code pattern array coincides with a center of the positioning mark point.
14. The mobile robot of claim 12, wherein the positioning tag further comprises identification information and a reference datum graphic, wherein:
the identification information is arranged in the positioning tag and is used for recording additional position information;
the reference datum graphic is arranged on the periphery of the positioning tag and is used for providing a position datum when the positioning tag is placed at the positioning mark point.
15. The mobile robot of claim 9, wherein the optical axis of the image acquisition device passes through the geometric center of the mobile robot in a vertical plane.
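As a concrete illustration of the array geometry in claims 3 to 5 and 13, the sketch below computes where the center of each sub two-dimensional code sits relative to the positioning mark point when the array's geometric center coincides with that point. The cell size and spacings are made-up values; the claims deliberately leave them open.

```python
# Minimal sketch of the n x m tag layout: compute the center of each sub
# two-dimensional code pattern relative to the positioning mark point, given
# that the geometric center of the array coincides with the mark point.
# Cell size and spacings below are illustrative, not values from the patent.
def sub_code_centers(n, m, cell=20.0, row_gap=5.0, col_gap=5.0):
    """Return {(row, col): (x_mm, y_mm)} with the mark point at (0, 0)."""
    pitch_x = cell + col_gap          # horizontal distance between cell centers
    pitch_y = cell + row_gap          # vertical distance between cell centers
    centers = {}
    for r in range(n):
        for c in range(m):
            x = (c - (m - 1) / 2.0) * pitch_x
            y = (r - (n - 1) / 2.0) * pitch_y
            centers[(r, c)] = (x, y)
    return centers

# Example: a 2 x 3 array; the mean of all centers is (0, 0), i.e. the array's
# geometric center sits exactly on the positioning mark point.
layout = sub_code_centers(2, 3)
```

By construction, the mean of the returned centers is (0, 0), which is precisely the coincidence condition recited in claims 4 and 13.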
CN201820085791.6U 2018-01-18 2018-01-18 Positioning navigation system and robot based on two-dimensional code Active CN208937054U (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201820085791.6U CN208937054U (en) 2018-01-18 2018-01-18 Positioning navigation system and robot based on two-dimensional code

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201820085791.6U CN208937054U (en) 2018-01-18 2018-01-18 Positioning navigation system and robot based on two-dimensional code

Publications (1)

Publication Number Publication Date
CN208937054U 2019-06-04

Family ID: 66713531

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201820085791.6U Active CN208937054U (en) 2018-01-18 2018-01-18 Positioning navigation system and robot based on two-dimensional code

Country Status (1)

Country Link
CN (1) CN208937054U (en)

Cited By (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108225303A (en) * 2018-01-18 2018-06-29 水岩智能科技(宁波)有限公司 Two-dimensional code positioning label, and positioning navigation system and method based on two-dimensional code
CN108225303B (en) * 2018-01-18 2024-06-14 港湾智能科技(苏州)有限公司 Two-dimensional code positioning label, positioning navigation system and method based on two-dimensional code
CN110465752A (en) * 2019-09-04 2019-11-19 深圳市智远数控有限公司 A kind of printing paper pattern-cut method and its laser cutting device
CN110465752B (en) * 2019-09-04 2021-06-04 深圳市智远数控有限公司 Photographic paper pattern cutting method and laser cutting equipment thereof
CN110842942A (en) * 2019-11-26 2020-02-28 南京智能仿真技术研究院有限公司 Intelligent robot with high-precision positioning system
CN111325045A (en) * 2020-03-20 2020-06-23 福州符号信息科技有限公司 Bar code positioning system and bar code positioning method for scanner
CN112462762A (en) * 2020-11-16 2021-03-09 浙江大学 Robot outdoor autonomous moving system and method based on roadside two-dimensional code unit
CN112462762B (en) * 2020-11-16 2022-04-19 浙江大学 Robot outdoor autonomous moving system and method based on roadside two-dimensional code unit
CN112987729A (en) * 2021-02-09 2021-06-18 灵动科技(北京)有限公司 Method and apparatus for controlling autonomous mobile robot
CN113208872A (en) * 2021-05-07 2021-08-06 上海羿生医疗科技有限公司 Upper limb rehabilitation training device and control method thereof
CN114330392A (en) * 2021-12-26 2022-04-12 中国电子科技集团公司第十四研究所 Multi-size self-adaptive code scanning system for radar array surface
CN114330392B (en) * 2021-12-26 2024-02-27 中国电子科技集团公司第十四研究所 Multi-size self-adaptive code scanning system for radar array surface

Similar Documents

Publication Publication Date Title
CN108225303B (en) Two-dimensional code positioning label, positioning navigation system and method based on two-dimensional code
CN208937054U (en) Positioning navigation system and robot based on two-dimensional code
CN107507167B (en) Cargo tray detection method and system based on point cloud plane contour matching
Lee et al. Low-cost 3D motion capture system using passive optical markers and monocular vision
CN109099915B (en) Mobile robot positioning method, mobile robot positioning device, computer equipment and storage medium
KR20200041355A (en) Simultaneous positioning and mapping navigation method, device and system combining markers
CN107609451A (en) A kind of high-precision vision localization method and system based on Quick Response Code
CN112734852A (en) Robot mapping method and device and computing equipment
CN109509221B (en) Positioning measurement system based on image ruler
CN111964680B (en) Real-time positioning method of inspection robot
US20140184815A1 (en) Calibration plate for calibrating a plurality of image capturing devices and method for calibrating a plurality of image capturing devices
CN103824275A (en) System and method for finding saddle point-like structures in an image and determining information from the same
KR102490521B1 (en) Automatic calibration through vector matching of the LiDAR coordinate system and the camera coordinate system
CN113221953B (en) Target attitude identification system and method based on example segmentation and binocular depth estimation
CN115609591A (en) 2D Marker-based visual positioning method and system and composite robot
CN109190434B (en) Bar code recognition algorithm based on sub-pixel level corner detection
CN115239822A (en) Real-time visual identification and positioning method and system for multi-module space of split type flying vehicle
CN207976755U (en) A kind of steel warehouse control system based on machine vision and PLC
CN111488762A (en) Lane-level positioning method and device and positioning equipment
CN114266822B (en) Workpiece quality inspection method and device based on binocular robot, robot and medium
CN112684792B (en) Two-dimensional code array label detection method and storage device
CN114862953A (en) Mobile robot repositioning method and device based on visual features and 3D laser
CN112446895B (en) Automatic extraction method, system, equipment and medium for checkerboard corner points
CN113095104A (en) Defective two-dimensional code positioning method
KR102077934B1 (en) Method for generating alignment data for virtual retrofitting object using video and Terminal device for performing the same

Legal Events

Date Code Title Description
GR01 Patent grant
TR01 Transfer of patent right
Effective date of registration: 20220519
Address after: 215200 4th floor, Tianze new energy building, No. 99, yunchuang Road, Jiangling street, Wujiang District, Suzhou City, Jiangsu Province
Patentee after: Gangwan Intelligent Technology (Suzhou) Co.,Ltd.
Address before: 315191 Yee Mun Village (Yinzhou District science and Technology Park), Jiangshan Town, Yinzhou District, Ningbo, Zhejiang
Patentee before: WATER ROCK INTELLIGENT TECHNOLOGY (NINGBO) Co.,Ltd.