CN112180926B - Linear guiding method and system of sweeping robot and sweeping robot - Google Patents

Linear guiding method and system of sweeping robot and sweeping robot

Info

Publication number
CN112180926B
CN112180926B
Authority
CN
China
Prior art keywords
moving image
line
sweeping robot
straight line
image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202011045423.7A
Other languages
Chinese (zh)
Other versions
CN112180926A (en)
Inventor
刘世勇
苟萧华
金秀芬
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hunan Grand Pro Robot Technology Co ltd
Original Assignee
Hunan Grand Pro Robot Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hunan Grand Pro Robot Technology Co ltd filed Critical Hunan Grand Pro Robot Technology Co ltd
Priority to CN202011045423.7A
Publication of CN112180926A
Application granted
Publication of CN112180926B
Legal status: Active
Anticipated expiration

Classifications

    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02 Control of position or course in two dimensions
    • G05D1/021 Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0231 Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
    • G05D1/0246 Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means
    • A HUMAN NECESSITIES
    • A47 FURNITURE; DOMESTIC ARTICLES OR APPLIANCES; COFFEE MILLS; SPICE MILLS; SUCTION CLEANERS IN GENERAL
    • A47L DOMESTIC WASHING OR CLEANING; SUCTION CLEANERS IN GENERAL
    • A47L11/00 Machines for cleaning floors, carpets, furniture, walls, or wall coverings
    • A47L11/24 Floor-sweeping machines, motor-driven
    • A HUMAN NECESSITIES
    • A47 FURNITURE; DOMESTIC ARTICLES OR APPLIANCES; COFFEE MILLS; SPICE MILLS; SUCTION CLEANERS IN GENERAL
    • A47L DOMESTIC WASHING OR CLEANING; SUCTION CLEANERS IN GENERAL
    • A47L11/00 Machines for cleaning floors, carpets, furniture, walls, or wall coverings
    • A47L11/40 Parts or details of machines not provided for in groups A47L11/02 - A47L11/38, or not restricted to one of these groups, e.g. handles, arrangements of switches, skirts, buffers, levers
    • A47L11/4002 Installations of electric equipment
    • A47L11/4008 Arrangements of switches, indicators or the like
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00 Image enhancement or restoration
    • G06T5/70 Denoising; Smoothing
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/10 Segmentation; Edge detection
    • G06T7/13 Edge detection
    • A HUMAN NECESSITIES
    • A47 FURNITURE; DOMESTIC ARTICLES OR APPLIANCES; COFFEE MILLS; SPICE MILLS; SUCTION CLEANERS IN GENERAL
    • A47L DOMESTIC WASHING OR CLEANING; SUCTION CLEANERS IN GENERAL
    • A47L2201/00 Robotic cleaning machines, i.e. with automatic control of the travelling movement or the cleaning operation
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/20 Special algorithmic details
    • G06T2207/20048 Transform domain processing
    • G06T2207/20061 Hough transform

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Theoretical Computer Science (AREA)
  • Electromagnetism (AREA)
  • Multimedia (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Automation & Control Theory (AREA)
  • Image Analysis (AREA)
  • Image Processing (AREA)

Abstract

The application provides a linear guiding method and system for a sweeping robot, and the sweeping robot itself. After simple edge detection is performed on an image, all straight-line paths in the image are obtained by applying a Hough line transform to the pixel points, and the best straight-line path is obtained by fitting all of those paths.

Description

Linear guiding method and system of sweeping robot and sweeping robot
[ field of technology ]
The application relates to the field of sweeping robots, in particular to a linear guiding method and system of a sweeping robot and the sweeping robot.
[ background Art ]
The automatic sweeping robot is widely used in both commercial and home environments, and mainly cleans along a preset path.
Existing vision-based sweeping robots can acquire a travel path by capturing images and performing image recognition, but complex paths in a scene often make the robot's extraction of straight-line paths inefficient, and the best straight line cannot be extracted.
[ application ]
In order to solve the problem that the existing sweeping robot is low in linear path extraction efficiency, the application provides a linear guiding method and system of the sweeping robot and the sweeping robot.
The application provides the following technical solution to solve the technical problem: a straight-line guiding method for a sweeping robot, comprising the following steps:
Step S1: acquiring a moving image of the moving area of the sweeping robot, wherein the moving area contains a plurality of straight-line paths and the moving image contains a center line;
Step S2: performing edge detection on the moving image to obtain the line contours of objects in the moving image;
Step S3: performing a Hough line transform on the pixel points that form the line contours in the moving image, so that they are displayed in a Hough coordinate system, and computing all straight-line paths in the moving image;
Step S4: computing and fitting a virtual straight line based on the included angle between the center line and each straight-line path, and driving the sweeping robot to move in a direction parallel to the virtual straight line.
Preferably, step S2 specifically includes the following steps: Step S21: performing pixel gradient calculations on the moving image in the vertical and horizontal directions respectively, to obtain a vertical pixel map and a horizontal pixel map; Step S22: superposing the vertical pixel map and the horizontal pixel map to obtain the line contours of objects in the moving image.
Preferably, step S4 specifically includes the following steps: Step S41: dividing each included angle into an integer part and a fractional part; Step S42: mapping the integer parts onto a chi-square distribution model, taking the extremum of the resulting normal distribution, and using it as the integer part of the virtual straight line's included angle; Step S43: averaging the fractional parts to obtain the fractional part of the virtual straight line's included angle.
Preferably, between step S1 and step S2 the method further comprises: Step S100: setting a mask based on a preset threshold: pixels in the moving image whose values exceed the preset threshold are replaced by the mask, and the filtered moving image is obtained after the picture is redrawn.
Preferably, after step S100 the method further includes: Step S101: denoising the filtered moving image using Gaussian filtering.
The application also provides a linear guide system for the sweeping robot, which comprises: an image acquisition unit, used to acquire a moving image of the moving area of the sweeping robot, wherein the moving area contains a plurality of straight-line paths and the moving image contains a center line; a contour preliminary calculation unit, used to perform edge detection on the moving image to obtain the line contours of objects in the moving image; a path calculation unit, used to perform a Hough line transform on the pixel points that form the line contours in the moving image, so that they are displayed in a Hough coordinate system, and to compute all straight-line paths in the moving image; and a path fitting unit, used to compute and fit a virtual straight line based on the included angle between the center line and each straight-line path, and to drive the sweeping robot to move in a direction parallel to the virtual straight line.
Preferably, the contour preliminary calculation unit further includes: a gradient calculation unit, used to perform pixel gradient calculations on the moving image in the vertical and horizontal directions respectively, to obtain a vertical pixel map and a horizontal pixel map; and an image superposition unit, used to superpose the vertical pixel map and the horizontal pixel map to obtain the line contours of objects in the moving image.
Preferably, the path fitting unit includes: a data dividing unit, used to divide each included angle into an integer part and a fractional part; an integer solving unit, used to map the integer parts onto a chi-square distribution model, take the extremum of the resulting normal distribution, and use it as the integer part of the virtual straight line's included angle; and a decimal solving unit, used to average the fractional parts to obtain the fractional part of the virtual straight line's included angle.
The application also provides a sweeping robot, comprising a housing, a wheel set and a camera, wherein the wheel set is arranged at the bottom of the housing, the camera is arranged on the housing and rotatably connected with it, and the wheel set is rotatably connected with the housing; the camera is used to acquire the moving image.
Compared with the prior art, the linear guiding method and system of the sweeping robot and the sweeping robot provided by the application have the following advantages:
1. After simple edge detection is performed on the image, all straight-line paths in the image are obtained by applying a Hough line transform to the pixel points, and the best straight-line path is obtained by fitting all of those paths. The straight-line guiding method of the sweeping robot provided by the second embodiment of the application can therefore rely on a simple image edge detection and recognition scheme while combining coordinate transformation with straight-line path fitting, which greatly reduces the computation required for image recognition and improves computational efficiency, that is, it improves the efficiency with which the sweeping robot extracts straight-line paths and lowers the demands on the processor.
2. The pixel gradients are calculated in the horizontal and vertical directions and then superposed to identify the contours formed by all objects in the moving image. This preliminary image recognition step avoids a large amount of deep object recognition computation and improves detection efficiency.
3. The integer parts are made to present a normal distribution through the chi-square distribution model, and the extremum of that distribution, i.e. the most frequently occurring value among the integer parts of the included angles, is taken as the integer part of the virtual straight line's angle, while the arithmetic mean of the fractional parts is taken as its fractional part; together they form the angle data of the virtual straight line. By solving the integer and fractional parts of the angle separately, the optimal virtual straight-line path is accurately fitted from among multiple straight-line paths, avoiding interference from other paths.
4. By setting a mask and performing mask replacement on pixels that exceed the preset threshold, the strong-light pixel values in the image are blocked, eliminating the influence of strong light on the image.
5. Gaussian filtering eliminates the noise introduced into the images captured by the camera 13, improving the recognition accuracy of straight-line paths.
[ description of the drawings ]
Fig. 1 is a schematic structural diagram of a sweeping robot according to a first embodiment of the present application.
Fig. 2 is a schematic view of a straight path and an obstacle in a moving image of the sweeping robot.
Fig. 3 is an overall flowchart of a linear guiding method of a sweeping robot according to a second embodiment of the present application.
Fig. 4 is a detailed flowchart of step S2 in a linear guiding method of a sweeping robot according to a second embodiment of the present application.
Fig. 5 is a detailed flowchart of step S4 in a linear guiding method of a sweeping robot according to a second embodiment of the present application.
Fig. 6 is a block diagram of a linear guide system of a sweeping robot according to a third embodiment of the present application.
Fig. 7 is a block diagram of a contour initial calculation unit in a linear guide system of a sweeping robot according to a third embodiment of the present application.
Fig. 8 is a block diagram of a path fitting unit in a linear guide system of a sweeping robot according to a third embodiment of the present application.
Reference numerals illustrate:
1-sweeping robot,
11-housing, 12-wheel set, 13-camera,
100-linear guide system of the sweeping robot,
101-image acquisition unit, 102-contour preliminary calculation unit, 103-path calculation unit, 104-path fitting unit,
1021-gradient calculation unit, 1022-image superposition unit,
1041-data dividing unit, 1042-integer solving unit, 1043-decimal solving unit,
200-moving image, 201-straight path, 202-center line, 203-obstacle.
[ detailed description ] of the application
For the purpose of making the technical solution and advantages of the present application more apparent, the present application will be further described in detail below with reference to the accompanying drawings and examples of implementation. It should be understood that the specific embodiments described herein are for purposes of illustration only and are not intended to limit the scope of the application.
Referring to fig. 1 and 2, a first embodiment of the present application provides a sweeping robot 1, which includes a housing 11, a wheel set 12 and a camera 13, wherein the wheel set 12 is disposed at the bottom of the housing 11, the camera 13 is disposed on the housing 11, the camera 13 is rotatably connected with the housing 11, and the wheel set 12 is rotatably connected with the housing 11. The camera 13 is used for acquiring a moving image of a moving area where the sweeping robot 1 is located, and the wheel set 12 is used for driving the sweeping robot 1 to move in the moving area.
It will be appreciated that, because the wheel set 12 is rotatably connected to the housing 11, when the sweeping robot 1 needs to change its heading while moving, the heading can be adjusted by rotating the wheel set 12 alone, without adjusting the angle of the entire housing 11.
It can be appreciated that, because the camera 13 is rotatably connected with the housing 11, the camera 13 can rotate by itself to obtain image data from every angle of the moving area where the sweeping robot 1 is located, thereby reducing the robot's blind zones and avoiding the redundant step of turning the whole body around to capture a forward image.
It will be appreciated that the housing 11 may be provided in a disc configuration, a cube configuration or other configuration, and in this embodiment, the housing 11 is provided in a disc configuration.
Referring to fig. 2 and 3, a second embodiment of the present application provides a straight-line guiding method for a sweeping robot, which performs straight-line path recognition using the sweeping robot 1 provided in the first embodiment, and includes the following steps:
step S1: acquiring a moving image of a moving area of the sweeping robot, wherein the moving area comprises a plurality of linear paths, and the moving image comprises a central line;
step S2: performing edge detection on the moving image to obtain a line contour of an object in the moving image;
step S3: performing a Hough line transform on the pixel points that form the line contours in the moving image, so that they are displayed in a Hough coordinate system, and computing all straight-line paths in the moving image; and
Step S4: computing and fitting a virtual straight line based on the included angle between the center line and each straight-line path, and driving the sweeping robot to move in a direction parallel to the virtual straight line.
It can be understood that in step S1, the sweeping robot 1 acquires the moving image 200 of the moving area through the camera 13. The moving area contains a plurality of straight-line paths 201, and after the moving image 200 is generated, a center line 202 is automatically generated; the center line 202 may be arranged horizontally or vertically, and serves as a reference for subsequent path recognition.
It can be understood that in step S2, edge detection is performed on the moving image 200 to obtain image information that the sweeping robot 1 can recognize. Since the robot 1 cannot directly recognize the straight-line paths 201 and the obstacle 203 (as shown in fig. 2) in the raw moving image, edge detection is performed to preliminarily identify the object contours in the image, which may include contour lines formed by obstacles, travel paths or other objects.
It can be understood that in step S3, based on the image obtained after the edge detection of step S2, a Hough line transform is performed on the pixel points that compose the line contours in the image, so that they are displayed in a Hough coordinate system. The Hough transform is a feature extraction technique widely used in image analysis, computer vision and digital image processing, for example to detect line features of objects. In this embodiment, after the Hough line transform a new Hough coordinate system is obtained, and all (x, y) points whose curves pass through the same point in the Hough coordinate system lie on the same target straight line, so all straight-line paths 201 in the moving image 200 can be accurately found.
It will be appreciated that, in the Hough line transform, different points on the same straight line correspond to curves in the parameter plane that pass through the same point. The Hough transform is applied to all points of the image, and detecting a straight line means finding the point in the parameter plane where the most curves intersect. Votes are accumulated at the intersection points, and the points whose vote counts exceed a minimum threshold are taken as the straight lines detected in the original coordinate system. In other words, the Hough line transform traces the intersections between the curves corresponding to each point in the image; if the number of curves intersecting at a point exceeds a threshold, the parameters represented by that intersection can be considered a straight line in the original image.
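As a minimal sketch of this voting procedure (pure Python; the function name, the 1-degree sampling resolution, and the toy points are invented for illustration, not taken from the patent), each contour pixel votes for every (theta, rho) cell it could lie on, and the cell that collects the most votes corresponds to a detected straight line:

```python
import math
from collections import Counter

def hough_line_votes(points, theta_steps=180):
    """Accumulate votes in the (theta, rho) parameter plane: each point (x, y)
    traces the sinusoid rho = x*cos(theta) + y*sin(theta), and the curves of
    collinear points all pass through one accumulator cell."""
    acc = Counter()
    for x, y in points:
        for t in range(theta_steps):            # theta sampled at 1-degree steps
            theta = math.radians(t)
            rho = round(x * math.cos(theta) + y * math.sin(theta))
            acc[(t, rho)] += 1
    return acc

# Five collinear points on the line y = x, i.e. theta = 135 degrees, rho = 0.
votes = hough_line_votes([(i, i) for i in range(5)])
# the cell (135, 0) receives one vote from every point, reaching the maximum count
```

A real implementation would bin rho more carefully and apply the minimum-vote threshold that the description above mentions before declaring a line.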
It can be understood that in step S4, based on the plurality of qualifying straight-line paths obtained in step S3, the included angle between each straight-line path 201 and the center line 202 is calculated, and a virtual straight line that meets the requirements is fitted from those angles, yielding the optimal straight line selected from the plurality of straight-line paths 201.
It can be understood that, after simple edge detection is performed on the image, all straight-line paths in the image are obtained by applying a Hough line transform to the pixel points, and the best straight-line path is obtained by fitting all of those paths. The straight-line guiding method provided by this second embodiment can therefore rely on a simple image edge detection and recognition scheme while combining coordinate transformation with straight-line path fitting, which greatly reduces the computation required for image recognition and improves computational efficiency, that is, it improves the efficiency with which the sweeping robot 1 extracts straight-line paths and lowers the demands on the processor.
Optionally, referring to fig. 3, as an embodiment, between step S1 and step S2, further includes:
step S100: setting a mask based on a preset threshold: pixels in the moving image whose values exceed the preset threshold are replaced by the mask, and the filtered moving image is obtained after the picture is redrawn.
After step S100, the method further comprises:
step S101: denoising the filtered moving image using Gaussian filtering.
It will be appreciated that in step S100, the preset threshold of the mask may be set to 200, that is, pixels in the moving image whose values exceed 200 are replaced by the mask, so that the strong-light pixel values in the image are blocked and the influence of strong light on the image is eliminated.
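A minimal sketch of this masking step (pure Python; the function name is invented, and the replacement value 0 is an assumption, since the text only says the over-threshold pixels are replaced by the mask):

```python
MASK_THRESHOLD = 200  # per this embodiment: pixel values above 200 are masked
MASK_FILL = 0         # replacement value is an assumption; the patent only says "mask"

def mask_strong_light(image, threshold=MASK_THRESHOLD, fill=MASK_FILL):
    """Replace over-bright pixels (e.g. specular highlights from strong light)
    so they cannot produce spurious edges in the later detection steps."""
    return [[fill if px > threshold else px for px in row] for row in image]

frame = [[ 10, 250,  40],
         [199, 201, 255]]
filtered = mask_strong_light(frame)
# 250, 201 and 255 exceed the threshold and are replaced; 199 and below pass through
```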
It can be understood that in step S101, Gaussian filtering eliminates the noise introduced into the image acquired by the camera 13, improving the recognition accuracy of the straight-line path.
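The Gaussian filtering of step S101 can be sketched as convolution with a small normalized Gaussian kernel (pure Python; the 3x3 size, sigma = 1.0 and the border clamping are illustrative choices, not values given in the patent):

```python
import math

def gaussian_kernel(size=3, sigma=1.0):
    """Build a normalized 2-D Gaussian kernel (weights sum to 1)."""
    c = size // 2
    k = [[math.exp(-((x - c) ** 2 + (y - c) ** 2) / (2 * sigma ** 2))
          for x in range(size)] for y in range(size)]
    total = sum(sum(row) for row in k)
    return [[v / total for v in row] for row in k]

def gaussian_blur(image, size=3, sigma=1.0):
    """Smooth a 2-D grayscale image; borders are handled by clamping indices."""
    k = gaussian_kernel(size, sigma)
    h, w, c = len(image), len(image[0]), size // 2
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            acc = 0.0
            for dy in range(size):
                for dx in range(size):
                    yy = min(max(y + dy - c, 0), h - 1)
                    xx = min(max(x + dx - c, 0), w - 1)
                    acc += k[dy][dx] * image[yy][xx]
            out[y][x] = acc
    return out
```

Because the kernel is normalized, flat regions pass through unchanged while isolated noise spikes are spread out and attenuated, which is what improves the later line recognition.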
Referring to fig. 4, step S2, performing edge detection on the moving image to obtain the line contours of objects in the moving image, specifically includes steps S21 to S22:
step S21: performing pixel gradient calculations on the moving image in the vertical and horizontal directions respectively, to obtain a vertical pixel map and a horizontal pixel map; and
Step S22: superposing the vertical pixel map and the horizontal pixel map to obtain the line contours of objects in the moving image.
It can be understood that in step S21, a first-approximation skeleton of the picture in the horizontal and vertical directions can be computed and drawn with the Sobel operator; then, by computing the gradient direction and magnitude of each pixel, non-maximum suppression removes pixels whose magnitude is not a local maximum, enhancing the response at the remaining edge pixels.
It can be understood that in steps S21 and S22, the gradient magnitude difference of pixels in the horizontal direction is computed by convolving first in the vertical direction and then in the horizontal direction, yielding the vertical edges; the gradient magnitude difference of pixels in the vertical (y) direction is computed by convolving first in the horizontal direction and then in the vertical direction, yielding the horizontal edges; finally, the vertical edge map is superposed with the horizontal edge map to obtain the contour of the whole image.
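The two-direction convolution and superposition described above can be sketched with the standard 3x3 Sobel kernels (pure Python; for simplicity this sketch leaves the one-pixel border at zero and uses |gx| + |gy| in place of the exact gradient magnitude):

```python
SOBEL_X = [[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]]   # horizontal gradient, responds to vertical edges
SOBEL_Y = [[-1, -2, -1], [0, 0, 0], [1, 2, 1]]   # vertical gradient, responds to horizontal edges

def convolve3x3(image, kernel):
    """Apply a 3x3 kernel to the interior pixels of a 2-D grayscale image."""
    h, w = len(image), len(image[0])
    out = [[0] * w for _ in range(h)]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            out[y][x] = sum(kernel[dy][dx] * image[y + dy - 1][x + dx - 1]
                            for dy in range(3) for dx in range(3))
    return out

def edge_magnitude(image):
    """Superpose the vertical and horizontal pixel maps into one contour map."""
    gx = convolve3x3(image, SOBEL_X)
    gy = convolve3x3(image, SOBEL_Y)
    return [[abs(gx[y][x]) + abs(gy[y][x]) for x in range(len(image[0]))]
            for y in range(len(image))]
```

On a vertical step edge, only the SOBEL_X map responds; on a horizontal step edge, only the SOBEL_Y map does, and the superposition captures both in a single contour image.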
It can be understood that the pixel gradients are calculated in the horizontal and vertical directions and then superposed to identify the contours formed by all objects in the moving image; this preliminary image recognition step avoids a large amount of deep object recognition computation and improves detection efficiency.
It is to be understood that steps S21 to S22 are only one implementation of this example, and implementation thereof is not limited to steps S21 to S22.
Referring to fig. 5, step S4, computing and fitting a virtual straight line based on the included angle between the center line and each straight-line path and driving the sweeping robot to move in a direction parallel to the virtual straight line, specifically includes steps S41 to S43:
step S41: dividing each included angle into an integer part and a fractional part;
step S42: mapping the integer parts onto a chi-square distribution model, taking the extremum of the resulting normal distribution, and using it as the integer part of the virtual straight line's included angle; and
Step S43: averaging the fractional parts to obtain the fractional part of the virtual straight line's included angle.
It is understood that in step S41, each straight-line path forms an included angle with the center line, and each angle is divided into an integer part and a fractional part.
It will be understood that in step S42, the integer parts mapped through the chi-square distribution model present a normal distribution, and the extremum of that distribution, i.e. the most frequently occurring value among the integer parts of the included angles, is taken as the integer part of the virtual straight line's included angle.
It can be understood that in step S43, the fractional part of the virtual straight line's included angle is obtained by averaging the fractional parts, and the path direction of the virtual straight line is obtained by combining the integer part with the fractional part.
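Under the assumption that taking the "extremum of the normal distribution" amounts to taking the most frequent integer value, steps S41 to S43 can be sketched as follows (pure Python; the function name and the sample angles are invented, and angles are assumed non-negative):

```python
from collections import Counter
from statistics import mean

def fit_virtual_angle(angles_deg):
    """Step S41: split each included angle into integer and fractional parts.
    Step S42: take the most frequent integer part (the distribution's peak).
    Step S43: average the fractional parts, then recombine the two."""
    ints = [int(a) for a in angles_deg]           # assumes non-negative angles
    fracs = [a - int(a) for a in angles_deg]
    peak_int = Counter(ints).most_common(1)[0][0]
    return peak_int + mean(fracs)

# Four paths cluster around 30 degrees; one outlier path at 87.1 degrees
# cannot shift the integer part, only nudge the averaged fractional part.
virtual_angle = fit_virtual_angle([30.2, 30.6, 30.4, 87.1, 30.3])
```

Solving the integer and fractional parts separately, as the description explains, keeps a single outlier path from dragging the fitted direction away from the dominant cluster.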
It is to be understood that steps S41 to S43 are only one implementation of this example, and implementation thereof is not limited to steps S41 to S43.
Referring to fig. 6, a third embodiment of the present application provides a linear guiding system 100 of a sweeping robot, which includes:
an image acquisition unit 101, configured to acquire a moving image of the moving area of the sweeping robot, where the moving area contains a plurality of straight-line paths and the moving image contains a center line;
a contour preliminary calculation unit 102, configured to perform edge detection on the moving image to obtain the line contours of objects in the moving image;
a path calculation unit 103, configured to perform a Hough line transform on the pixel points that form the line contours in the moving image, so that they are displayed in a Hough coordinate system, and to compute all straight-line paths in the moving image; and
a path fitting unit 104, configured to compute and fit a virtual straight line based on the included angle between the center line and each straight-line path, and to drive the sweeping robot to move in a direction parallel to the virtual straight line.
Referring to fig. 7, the contour preliminary calculation unit 102 further includes:
a gradient calculation unit 1021, configured to perform pixel gradient calculations on the moving image in the vertical and horizontal directions respectively, to obtain a vertical pixel map and a horizontal pixel map; and
an image superposition unit 1022, configured to superpose the vertical pixel map and the horizontal pixel map to obtain the line contours of objects in the moving image.
With continued reference to fig. 8, the path fitting unit 104 further includes:
a data dividing unit 1041, configured to divide each included angle into an integer part and a fractional part;
an integer solving unit 1042, configured to map the integer parts onto a chi-square distribution model, take the extremum of the resulting normal distribution, and use it as the integer part of the virtual straight line's included angle; and
a decimal solving unit 1043, configured to average the fractional parts to obtain the fractional part of the virtual straight line's included angle.
In particular, according to embodiments of the present disclosure, the processes described above with reference to flowcharts may be implemented as computer software programs. For example, embodiments of the present disclosure include a computer program product comprising a computer program embodied on a computer readable medium, the computer program comprising program code for performing the method shown in the flowcharts.
The above-described functions defined in the method of the application are performed when the computer program is executed by a processor. It should be noted that, the computer memory according to the present application may be a computer readable signal medium or a computer readable storage medium, or any combination of the two. The computer memory may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or a combination of any of the foregoing.
More specific examples of computer memory may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the context of this document, a computer readable storage medium may be any tangible medium that can contain or store a program for use by or in connection with an instruction execution system, apparatus, or device. A computer readable signal medium, by contrast, may include a data signal propagated in baseband or as part of a carrier wave, with computer-readable program code embodied therein. Such a propagated data signal may take any of a variety of forms, including, but not limited to, electromagnetic or optical forms, or any suitable combination of the foregoing. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: wireless, wire, fiber optic cable, RF, etc., or any suitable combination of the foregoing.
Computer program code for carrying out operations of the present application may be written in any combination of one or more programming languages, including object oriented programming languages such as Java, Smalltalk, or C++, and conventional procedural programming languages such as the "C" programming language or similar languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on a remote computer or server. In the latter case, the remote computer may be connected to the user's computer through any kind of network, including a local area network (LAN) or a wide area network (WAN), or may be connected to an external computer (for example, through the Internet using an Internet service provider).
The flowcharts and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present application. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The units involved in the embodiments of the present application may be implemented in software or in hardware. The described units may also be provided in a processor, for example described as: a processor including an image acquisition unit, a contour preliminary calculation unit, a path calculation unit, and a path fitting unit. The names of these units do not, in some cases, constitute a limitation of the units themselves; for example, the contour preliminary calculation unit may also be described as "a unit that performs edge detection on the moving image to obtain a line contour of an object in the moving image".
As another aspect, the present application also provides a computer memory, which may be included in the apparatus described in the above embodiments, or may exist alone without being incorporated into the apparatus. The computer memory carries one or more programs that, when executed by the apparatus, cause the apparatus to: acquire a moving image of the moving area of the sweeping robot, wherein the moving area contains a plurality of straight-line paths and the moving image contains a central line; perform edge detection on the moving image to obtain the line contours of objects in the moving image; perform a Hough line transform on the pixel points forming those line contours, so that they are represented in a Hough coordinate system, and calculate all straight-line paths in the moving image; and calculate and fit a virtual straight line based on the included angle between the central line and each straight-line path, driving the sweeping robot to move parallel to the virtual straight line.
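The edge-detection step above (steps S21-S22: pixel gradients in the horizontal and vertical directions, superimposed into one line contour) might look like the following NumPy sketch. The central-difference gradient operator is an assumption, since the document does not specify which operator is used:

```python
import numpy as np

def line_contour(img):
    """Approximate the claimed contour extraction: gradient magnitude in the
    horizontal and vertical directions, superimposed into one edge map."""
    img = img.astype(float)
    gx = np.zeros_like(img)
    gy = np.zeros_like(img)
    gx[:, 1:-1] = img[:, 2:] - img[:, :-2]   # horizontal pixel gradient
    gy[1:-1, :] = img[2:, :] - img[:-2, :]   # vertical pixel gradient
    return np.abs(gx) + np.abs(gy)           # superposition of both pixel maps
```

Pixels where the superimposed gradient is large form the line contour that is then fed to the Hough transform.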
The above embodiments are merely preferred embodiments of the present application, and are not intended to limit the present application, but any modifications, equivalents, improvements, etc. within the principles of the present application should be included in the scope of the present application.

Claims (3)

1. A linear guiding method of a sweeping robot is characterized in that: the method comprises the following steps:
step S1: acquiring a moving image of a moving area of the sweeping robot, wherein the moving area comprises a plurality of linear paths, and the moving image comprises a central line;
step S2: performing edge detection on the moving image to obtain a line contour of an object in the moving image;
step S3: performing a Hough line transform based on the plurality of pixel points forming the line contour in the moving image, so that they are represented in a Hough coordinate system, and calculating all straight-line paths in the moving image; and
Step S4: calculating and fitting a virtual straight line based on the included angle between the central line and each straight-line path, and driving the sweeping robot to move parallel to the virtual straight line;
the step S2 specifically includes the following steps:
step S21: respectively carrying out pixel gradient calculation on the moving image in the vertical direction and the horizontal direction to obtain a vertical pixel diagram and a horizontal pixel diagram; and
Step S22: superposing the vertical pixel diagram and the horizontal pixel diagram to obtain a line contour of an object in the moving image;
the step S4 specifically includes the following steps:
step S41: dividing each included angle into an integer part and a decimal part;
step S42: mapping the integer part to a chi-square distribution model, taking an extremum of normal distribution, and taking the extremum as the integer part of the corresponding included angle of the virtual straight line; and
Step S43: and averaging the decimal parts to obtain the decimal part of the included angle corresponding to the virtual straight line.
2. The straight line guiding method of a sweeping robot according to claim 1, wherein between steps S1 and S2 the method further comprises:
step S100: setting a mask based on a preset threshold, replacing the pixels in the moving image that are higher than the preset threshold with the mask, and obtaining the filtered moving image after the picture is redrawn.
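Step S100's mask replacement might be sketched as follows; the particular threshold value and mask (fill) value are illustrative assumptions, not values given in the document:

```python
import numpy as np

def mask_highlights(img, threshold=240, fill=0):
    """Sketch of step S100: pixels above a preset threshold are replaced by
    a mask value, suppressing strong-light regions before redrawing."""
    out = img.copy()
    out[out > threshold] = fill   # blot out over-exposed (strong-light) pixels
    return out
```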
3. The straight line guiding method of the sweeping robot according to claim 2, wherein: after step S100, the method further comprises:
step S101: denoising the filtered moving image based on Gaussian filtering.
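Step S101's Gaussian denoising can be sketched with a separable NumPy convolution; the kernel radius and sigma below are illustrative choices, not values specified in the document:

```python
import numpy as np

def gaussian_blur(img, sigma=1.0, radius=2):
    """Sketch of step S101: separable Gaussian smoothing to suppress camera
    noise before straight-line extraction."""
    x = np.arange(-radius, radius + 1)
    kernel = np.exp(-x**2 / (2 * sigma**2))
    kernel /= kernel.sum()                               # normalize weights
    # Pad with edge values, filter rows then columns (the kernel is separable)
    padded = np.pad(img.astype(float), radius, mode='edge')
    rows = np.apply_along_axis(
        lambda r: np.convolve(r, kernel, mode='same'), 1, padded)
    out = np.apply_along_axis(
        lambda c: np.convolve(c, kernel, mode='same'), 0, rows)
    return out[radius:-radius, radius:-radius]           # crop back to input size
```

Because the kernel is normalized, flat regions pass through unchanged while isolated noise spikes are spread out and attenuated.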
CN202011045423.7A 2020-09-28 2020-09-28 Linear guiding method and system of sweeping robot and sweeping robot Active CN112180926B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011045423.7A CN112180926B (en) 2020-09-28 2020-09-28 Linear guiding method and system of sweeping robot and sweeping robot


Publications (2)

Publication Number Publication Date
CN112180926A CN112180926A (en) 2021-01-05
CN112180926B true CN112180926B (en) 2023-10-03

Family

ID=73946471

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011045423.7A Active CN112180926B (en) 2020-09-28 2020-09-28 Linear guiding method and system of sweeping robot and sweeping robot

Country Status (1)

Country Link
CN (1) CN112180926B (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101608924A (en) * 2009-05-20 2009-12-23 电子科技大学 A kind of method for detecting lane lines based on gray scale estimation and cascade Hough transform
CN101794442A (en) * 2010-01-25 2010-08-04 哈尔滨工业大学 Calibration method for extracting illumination-insensitive information from visible images
CN106504182A (en) * 2016-11-02 2017-03-15 山东正晨科技股份有限公司 A kind of extraction of straight line system based on FPGA
CN109947114A (en) * 2019-04-12 2019-06-28 南京华捷艾米软件科技有限公司 Robot complete coverage path planning method, device and equipment based on grating map
CN111652033A (en) * 2019-12-12 2020-09-11 苏州奥易克斯汽车电子有限公司 Lane line detection method based on OpenCV

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7349123B2 (en) * 2004-03-24 2008-03-25 Lexmark International, Inc. Algorithms and methods for determining laser beam process direction position errors from data stored on a printhead


Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
Marcos Ogaz. Data Processing from a Laser Range Finder Sensor for the Construction of Geometric Maps of an Indoor Environment. IEEE, 2009, full text. *
Indoor mobile robot map construction based on an improved line feature extraction algorithm; Jian Ming; Tang Mozhen; Zhang Cuifang; Yan Fei; Computer Engineering (01), full text *
Hao Zexing; Guo Gaizhi. Table image correction method based on an improved Canny operator and Hough transform. Journal of Inner Mongolia Normal University (Natural Science Edition), 2020, (05), full text. *
Ma Yimeng. Research on lane line detection and recognition algorithms under complex road conditions. China Excellent Master's and Doctoral Theses Full-text Database, 2016, full text. *


Similar Documents

Publication Publication Date Title
US10373380B2 (en) 3-dimensional scene analysis for augmented reality operations
CN110163912B (en) Two-dimensional code pose calibration method, device and system
US9786062B2 (en) Scene reconstruction from high spatio-angular resolution light fields
EP2570993B1 (en) Egomotion estimation system and method
US8848978B2 (en) Fast obstacle detection
CN107123142B (en) Pose estimation method and device
WO2011145436A1 (en) Methods, apparatus, program and media for edge detection technique having improved feature visibility
US20230005278A1 (en) Lane extraction method using projection transformation of three-dimensional point cloud map
CN110619674B (en) Three-dimensional augmented reality equipment and method for accident and alarm scene restoration
CN112258519B (en) Automatic extraction method and device for way-giving line of road in high-precision map making
CN105205459B (en) A kind of recognition methods of characteristics of image vertex type and device
CN112927303B (en) Lane line-based automatic driving vehicle-mounted camera pose estimation method and system
EP4058874A1 (en) Method and system for associating device coordinate systems in a multi-person ar system
CN112991374A (en) Canny algorithm-based edge enhancement method, device, equipment and storage medium
CN114782529B (en) Live working robot-oriented line grabbing point high-precision positioning method, system and storage medium
US20210049382A1 (en) Non-line of sight obstacle detection
CN112180926B (en) Linear guiding method and system of sweeping robot and sweeping robot
CN117611525A (en) Visual detection method and system for abrasion of pantograph slide plate
WO2016087173A1 (en) Method and apparatus for providing increased obstacle visibility
CN115468578B (en) Path planning method and device, electronic equipment and computer readable medium
CN114596307A (en) Method for measuring length of hanger of railway contact net based on unmanned aerial vehicle and machine vision
Cao et al. Depth image vibration filtering and shadow detection based on fusion and fractional differential
CN114581890B (en) Method and device for determining lane line, electronic equipment and storage medium
CN116703952B (en) Method and device for filtering occlusion point cloud, computer equipment and storage medium
CN110706334B (en) Three-dimensional reconstruction method for industrial part based on three-dimensional vision

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant