CN114162279A - Ship cleaning method and device based on machine vision - Google Patents

Ship cleaning method and device based on machine vision

Info

Publication number
CN114162279A
CN114162279A (application CN202111260358.4A)
Authority
CN
China
Prior art keywords
robot
cleaning
area
stain
track
Prior art date
Legal status
Granted
Application number
CN202111260358.4A
Other languages
Chinese (zh)
Other versions
CN114162279B (en)
Inventor
陈钟恭
郑华锋
张飞檐
Current Assignee
Fujian Weibai Industrial Robot Co ltd
Original Assignee
Fujian Weibai Industrial Robot Co ltd
Priority date
Filing date
Publication date
Application filed by Fujian Weibai Industrial Robot Co ltd
Priority to CN202111260358.4A
Publication of CN114162279A
Application granted
Publication of CN114162279B
Legal status: Active
Anticipated expiration

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B63SHIPS OR OTHER WATERBORNE VESSELS; RELATED EQUIPMENT
    • B63BSHIPS OR OTHER WATERBORNE VESSELS; EQUIPMENT FOR SHIPPING 
    • B63B59/00Hull protection specially adapted for vessels; Cleaning devices specially adapted for vessels
    • B63B59/06Cleaning devices for hulls

Landscapes

  • Chemical & Material Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Combustion & Propulsion (AREA)
  • Mechanical Engineering (AREA)
  • Ocean & Marine Engineering (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

The invention provides a machine-vision-based ship cleaning method and device. The method comprises: acquiring a plurality of rectangular stain areas; deriving the robot's travel track from the abscissa or ordinate of each stain area's center point; dividing each stain area into M equal parts to obtain a plurality of cleaning areas, and deriving the cleaning track from the bisectors and side lines; checking, each time the robot travels a set distance, whether it has deviated from the travel track and, if so, adjusting its posture to return to it; and, on reaching the target stain area, cleaning along the cleaning track. The specific cleaning process requires no manual control, which effectively improves cleaning efficiency and cleaning effect.

Description

Ship cleaning method and device based on machine vision
Technical Field
The invention relates to a ship cleaning method and device based on machine vision.
Background
A ship is a vehicle that sails or moors in a body of water for transport or other operations; its technical performance, equipment and structure vary with its intended use. In service, the hull plating is immersed for long periods and exposed to silt and microorganisms, so a great deal of dirt and organisms adhere to its surface and the plating may even rust, damaging the ship's structure. Ships therefore need to be cleaned from time to time.
Because of the particular shape and structure of a ship, cleaning is generally performed by a robot, but existing robots must be manually controlled throughout the cleaning task, which is time-consuming, labor-intensive and inefficient.
Disclosure of Invention
The invention provides a machine-vision-based ship cleaning method and device in which the robot cleans automatically along a travel track and a cleaning track, without manual control of the specific cleaning process, effectively improving cleaning efficiency and cleaning effect.
The invention is realized by the following technical scheme:
a ship cleaning method based on machine vision comprises the following steps:
A. acquiring an initial image of a part to be cleaned of a ship, and marking a plurality of rectangular stain areas in the initial image;
B. dividing all the stain areas into N area groups according to the abscissa (or ordinate) of each stain area's center point; determining, from the ordinate (or abscissa) of the center points, the order in which the robot passes through the stain areas of the ith area group; after the last stain area of the ith group, entering the stain area of the (i+1)th group that lies at the end nearest to it; and repeating back and forth to obtain the robot's travel track;
C. dividing each stain area, transversely or longitudinally, into M equal parts to obtain a plurality of cleaning areas, each narrower than the diameter of the robot's cleaning disc; the bisectors and the side lines of each rectangular stain area form the robot's cleaning track within that area: starting at one end of a side line parallel to the bisectors, the track runs to the other end of that side line, then moves horizontally to the adjacent bisector, and so on until it reaches the end of the opposite parallel side line;
D. controlling the robot to start to travel from the starting point of the travel track, and taking the starting point as a first position of the robot;
E. after the robot continues to travel a certain distance, acquiring its current position as a second position, deriving the actual traveling direction from the first and second positions, and judging from that direction whether the robot has deviated from the travel track; if so, entering step F, otherwise step G;
F. stopping the robot, adjusting its posture and making it travel along a new travel track, the new track being the line from the stopping point to the end point of the travel track, and entering step G;
G. judging whether the target stain area is reached, if so, cleaning according to the cleaning track, otherwise, entering the step H;
H. the robot position at this time is set to the new first position, and the process proceeds to step E.
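The control loop of steps D to H can be sketched as follows. This is a minimal Python sketch, not the patented implementation: the function name `supervise` and the list-of-observed-positions interface are hypothetical, and the planned heading is taken from the current first position to the sub-track end point (consistent with step F, which redirects the robot from its stopping point straight to the track end point).

```python
import math

def supervise(track_end, positions, threshold_deg=5.0):
    """Steps D-H for one sub-track as a pure decision loop.

    `positions` holds the robot positions observed after each fixed
    travel distance; positions[0] is the starting point (step D).
    Returns the decision made at each observation: 'clean' when the
    target stain area is reached (step G), 'adjust' when the heading
    deviates and the posture must be corrected (steps E and F),
    otherwise 'continue' (step H: the position becomes the new first
    position).
    """
    decisions = []
    first = positions[0]                      # step D: first position
    for second in positions[1:]:              # step E: second position
        if second == track_end:               # step G: target reached
            decisions.append('clean')
            break
        v1 = (second[0] - first[0], second[1] - first[1])        # actual heading
        v2 = (track_end[0] - first[0], track_end[1] - first[1])  # planned heading
        cos_t = (v1[0] * v2[0] + v1[1] * v2[1]) / (
            math.hypot(*v1) * math.hypot(*v2))
        theta = math.degrees(math.acos(max(-1.0, min(1.0, cos_t))))
        decisions.append('adjust' if theta > threshold_deg else 'continue')
        first = second                        # step H: new first position
    return decisions
```

The 5-degree default matches the angle threshold used in the embodiment below.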
Further, the acquiring of the actual traveling direction of the robot in the step E includes the following steps:
e1, after the robot travels a certain distance, shooting a second image containing the robot, and inputting the image into the trained deep learning model to identify the robot;
e2, judging whether the confidence of the recognition result in the step E1 reaches a preset value, if so, entering the step E3, otherwise, entering the step E1;
e3, obtaining an ROI area according to the recognition result, and carrying out proportional transformation and Gaussian filtering on the ROI area;
e4, extracting target feature information from the ROI processed in step E3 to obtain at least one target feature, and taking the target whose center position differs least from the first position as the final target, the center position of the final target being the second position;
and E5, acquiring a first vector from the first position to the second position, wherein the direction of the first vector is the actual traveling direction of the robot.
Further, judging in step E whether the robot deviates from the travel track comprises the following steps: taking the direction of the robot's planned travel path as a second vector, calculating the included angle theta between the first vector and the second vector, and judging that the robot has deviated from the travel track when theta exceeds a preset angle threshold; in step F the posture is adjusted by rotating the robot back through theta.
Further, the step B specifically includes:
b1, acquiring the coordinates of the central points of the stain areas, and arranging the central points according to the size of the abscissa or ordinate values;
b2, dividing the arranged central points into N groups, namely dividing each stain area into N area groups, wherein the abscissa or ordinate value of the central point of each stain area in each area group is in the same range;
b3, arranging the central points of the stain areas in the ith area group according to the size of the ordinate or the abscissa, namely, obtaining the partial travel track in the area group;
and B4, after passing through the last stain area of the ith area group according to the partial traveling track, the robot enters a stain area which is positioned at the end part and is closest to the stain area in the (i + 1) th area group, and the robot sequentially reciprocates to form the traveling track of the robot.
Further, the initial image contains the robot; in step C, before the M-equal division, the ratio of the robot's actual size to its size in the initial image is calculated to determine the width of each cleaning area, and hence the value of M.
Further, in the step E1, the distance corresponds to 10-15 frames of images.
Further, the travel track in step B comprises a plurality of sub-tracks, each corresponding to the route between two stain areas; the travel track against which deviation is judged in step E is one of these sub-tracks.
Further, the ship has a plurality of parts to be cleaned, each with its own robot; every part is cleaned according to steps A to H, and the robots of two adjacent parts can communicate with each other during cleaning.
Further, the initial image and the second image are captured by a pan-tilt camera set apart from the ship and positioned so that its images can contain the robot.
The invention is also realized by the following technical scheme:
the utility model provides a boats and ships belt cleaning device based on machine vision, includes cloud platform camera, control module and following module:
a stain area acquisition module: acquiring an initial image of a part to be cleaned of a ship through a pan-tilt camera and feeding the initial image back to a control module, and marking a plurality of rectangular stain areas in the initial image through the control module;
a travel track acquisition module: the control module divides all the stain areas into N area groups according to the abscissa or ordinate of each stain area's center point, determines from the ordinate or abscissa of the center points the order in which the robot passes through the stain areas of the ith area group, and after the last stain area of the ith group has the robot enter the nearest stain area of the adjacent group, thereby obtaining the robot's travel track;
a cleaning track acquisition module: the control module divides each stain area, transversely or longitudinally, into M equal cleaning areas, each narrower than the diameter of the robot's cleaning disc; the bisectors and the side lines of the rectangle form the robot's cleaning track within the stain area, running from one end of a side line parallel to the bisectors to its other end, then horizontally to the adjacent bisector, and so on until the end of the opposite parallel side line;
a cleaning module: the control module makes the robot travel from the starting point of the travel track, taking that point as the robot's first position; after the robot travels a certain distance its position is acquired as a second position, and its actual traveling direction is derived from the two positions; if that direction deviates from the travel track, the robot's posture is adjusted so that it travels along a new travel track; the check is repeated at intervals until the target stain area is reached, whereupon cleaning proceeds along the cleaning track.
The invention has the following beneficial effects:
1. Before cleaning, all the stain areas are divided into N area groups according to the abscissa or ordinate of each center point, and the robot's travel track, the shortest path, is determined; each stain area is divided transversely or longitudinally into M equal cleaning areas, and the cleaning track is determined from the side lines and bisectors of the rectangular stain area. The robot then cleans automatically along the travel and cleaning tracks, without manual control of the specific cleaning process; this frees up labor and effectively improves cleaning efficiency and effect. Because each cleaning area is narrower than the robot's cleaning disc, no strip is left uncleaned and the result is better. While the robot travels, it is checked at intervals for deviation from the travel track; when deviation is confirmed, its posture is corrected in time and it proceeds along a new travel track, so the cleaning work can be completed smoothly.
2. The ship has a plurality of parts to be cleaned, and all of them can be cleaned simultaneously, greatly improving cleaning efficiency; the robots of two adjacent parts to be cleaned can communicate with each other to avoid accidents such as collisions.
Drawings
The present invention will be described in further detail with reference to the accompanying drawings.
FIG. 1 is a flow chart of the present invention.
FIG. 2 is a schematic of the travel and wash trajectories of the present invention.
Detailed Description
The machine-vision-based ship cleaning device comprises a control module, a plurality of pan-tilt cameras and a plurality of robots. The areas of the ship that need cleaning are divided into a plurality of parts to be cleaned; each part is assigned one or two pan-tilt cameras and one robot. The pan-tilt cameras are set apart from the ship, in front of the parts to be cleaned, so that each camera can capture its robot. The control module is a computer connected to the pan-tilt cameras, and the images they capture are fed back to it in real time. All parts to be cleaned are cleaned simultaneously according to the machine-vision-based ship cleaning method, and the two robots of two adjacent parts can communicate with each other to avoid accidents such as collisions.
As shown in fig. 1 and 2, the ship washing method based on machine vision includes the following steps:
A. a pan-tilt camera acquires an initial image of a part to be cleaned and feeds it back to the computer; using software installed on the computer (the software is prior art), a worker marks a plurality of rectangular stain areas 2 in the initial image according to the ship's condition shown there; the initial image also contains the robot;
B. dividing all the stain areas 2 into N area groups 1 according to the abscissa of each stain area's center point; determining, from the ordinate of the center points, the order in which the robot passes through the stain areas 2 of the ith area group 1; after the last stain area 2 of the ith group 1, entering the stain area 2 of the (i+1)th group 1 that lies at the end nearest to it; and repeating back and forth to obtain the robot's travel track;
the method specifically comprises the following steps:
b1, acquiring the coordinates of the center points of the stain areas 2 and arranging them by abscissa value; the center-point acquisition process is prior art;
b2, dividing the arranged center points into N groups, i.e. dividing the stain areas 2 into N area groups 1, such that the center-point abscissas within each area group 1 fall in the same range, for example 0-100 for the first group and 101-200 for the second;
b3, arranging the center points of the stain areas 2 in the ith area group 1 in order of ordinate value, which gives the partial travel track within that area group 1;
b4, the robot starts from the stain area 2 whose center ordinate is smallest in the ith area group 1, passes through each stain area 2 of that group along the partial travel track, then enters the stain area 2 whose center ordinate is largest in the (i+1)th area group 1, and so on back and forth, giving the robot's travel track; the travel track comprises several sub-tracks, each corresponding to the route between two adjacent stain areas 2; as shown in FIG. 2, segment BC is one sub-track;
In this embodiment, grouping uses the abscissa of each center point and sorting within each area group 1 uses the ordinate; in other embodiments, grouping may use the ordinate and sorting the abscissa;
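Steps B1 to B4 amount to a serpentine ordering of the stain-area centers. The sketch below is a hypothetical Python illustration: the function name `travel_order` and the fixed `band_width` of 100 (matching the 0-100 / 101-200 example above) are assumptions, with grouping by abscissa and ordering by ordinate as in this embodiment.

```python
def travel_order(centers, band_width=100):
    """Steps B1-B4: group stain-area center points into bands by
    abscissa, then visit each band's centers in alternating (serpentine)
    order of ordinate, so each band ends next to where the next begins.

    centers: list of (x, y) stain-area center points.
    Returns the centers in visiting order.
    """
    # B1-B2: bucket the centers into bands of equal abscissa range
    bands = {}
    for x, y in centers:
        bands.setdefault(int(x // band_width), []).append((x, y))
    order = []
    # B3-B4: within each band sort by ordinate, reversing every other band
    for i, key in enumerate(sorted(bands)):
        group = sorted(bands[key], key=lambda p: p[1], reverse=(i % 2 == 1))
        order.extend(group)
    return order
```

Reversing every other band reproduces the back-and-forth rule of step B4: an ascending band ends at the largest ordinate, and the next band starts from its largest ordinate.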
C. dividing each stain area 2 longitudinally into M equal parts to obtain a plurality of cleaning areas 3, each narrower than the diameter of the robot's cleaning disc; the bisectors and the side lines of each rectangular stain area 2 form the robot's cleaning track within that area: starting at one end of a side line parallel to the bisectors, the track runs to the other end of that side line, then moves horizontally to the adjacent bisector, and so on until it reaches the end of the opposite parallel side line, as shown in FIG. 2;
Before the M-equal division, the computer calculates the ratio of the robot's actual size to its size in the initial image in order to determine the width of the divided cleaning areas 3, i.e. the value of M;
Because the stain areas 2 differ in size, the widths of their cleaning areas 3, and hence their M values, also differ; it is only required that each cleaning area 3 be narrower than the robot's cleaning disc. To improve cleaning efficiency, the cleaning areas 3 should be as wide as possible while remaining narrower than the disc. Some stain areas 2 may not divide exactly; in that case the last cleaning area 3 is excluded from the equal division, and it suffices that its width is smaller than the disc diameter;
In this embodiment the stain areas 2 are divided into M equal parts longitudinally; in other embodiments they may be divided transversely;
In FIG. 2, the thick lines between A and D are the robot's travel track and cleaning track; the travel track comprises several sub-tracks, of which segment BC is one; the thick lines between A and B are a cleaning track, and segment AB is a bisector;
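Step C can be illustrated with a short sketch. Assumptions not stated in the patent: the function name `cleaning_track` is hypothetical, the stain area is axis-aligned with top-left corner (x0, y0), and the cleaning-disc diameter has already been converted into image pixels using the robot-size ratio described above. M is chosen as the smallest number of equal strips whose width stays strictly below the disc diameter.

```python
import math

def cleaning_track(x0, y0, width, height, disc_diameter_px):
    """Step C: split a rectangular stain area (top-left (x0, y0),
    width x height, in image pixels) into M equal vertical strips, each
    narrower than the cleaning disc, and return the serpentine waypoints
    that run along each strip boundary (side lines and bisectors)."""
    # Smallest M whose strip width is strictly below the disc diameter
    M = math.floor(width / disc_diameter_px) + 1
    strip = width / M
    waypoints = []
    for i in range(M + 1):      # M + 1 lines: two side lines, M - 1 bisectors
        x = x0 + i * strip
        if i % 2 == 0:          # run down one line, then back up the next
            waypoints += [(x, y0), (x, y0 + height)]
        else:
            waypoints += [(x, y0 + height), (x, y0)]
    return M, waypoints
```

For a 100-pixel-wide area and a 30-pixel disc this gives M = 4 and 25-pixel strips, so adjacent passes overlap and no seam is left.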
D. making the robot travel from the starting point of the travel track (namely the lower end A of the side line of the first stain area 2 in the first area group 1), taking that point as the robot's first position;
E. after the robot continues to travel a certain distance, acquiring its current position as a second position, deriving the actual traveling direction from the first and second positions, and judging from that direction whether the robot has deviated from the sub-track it is on; if so, entering step F, otherwise step G;
the acquisition of the actual traveling direction of the robot comprises the following steps:
e1, after the robot travels a certain distance, the computer acquires from the pan-tilt camera a second image containing the robot and inputs it into a trained deep learning model to identify the robot; the distance corresponds to 10 frames of images, i.e. the distance the robot travels in the time spanned by 10 frames; the deep learning model used is prior art;
e2, judging whether the confidence of the recognition result in the step E1 reaches a preset value (the preset value is set to be 0.8), if so, entering the step E3, otherwise, entering the step E1;
e3, obtaining an ROI area according to the recognition result, and carrying out proportional transformation and Gaussian filtering on the ROI area;
e4, extracting target feature information from the ROI processed in step E3 to obtain at least one target feature, and taking the target whose center position differs least from the first position as the final target; the center position of the final target is the second position; if no target feature is found, the second image is acquired and identification performed again;
e5, acquiring a first vector from the first position to the second position, wherein the direction of the first vector is the actual traveling direction of the robot;
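The selection in step E4 can be sketched as a nearest-center search. `pick_final_target` is a hypothetical name, and the candidate centers stand for the centers of the target features extracted from the processed ROI.

```python
def pick_final_target(candidate_centers, first_position):
    """Step E4: among the target features extracted from the processed
    ROI, keep the one whose center is nearest the first position; its
    center is taken as the robot's second position."""
    def dist2(c):
        # squared distance is enough for comparing candidates
        return (c[0] - first_position[0]) ** 2 + (c[1] - first_position[1]) ** 2
    if not candidate_centers:
        return None   # no target found: re-shoot and re-identify (step E1)
    return min(candidate_centers, key=dist2)
```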
Taking the direction of the sub-track the robot is on as a second vector, the included angle theta between the first and second vectors is calculated; when theta exceeds a preset threshold the robot is judged to have deviated from the travel track; in this embodiment the angle threshold is set to 5°;
F. stopping the robot, adjusting its posture (specifically, rotating it back through theta) and making it continue along a new travel track, the new track being the line from the stopping point to the end point of the sub-track it was on, and entering step G;
G. judging whether the target stain area has been reached; if so, cleaning along the cleaning track, otherwise entering step H; as shown in FIG. 2, if the robot is moving on sub-track BC, the stain area 2 at point C is the target stain area;
H. the robot position at this time is set to the new first position, and the process proceeds to step E.
From the above method it can be seen that the machine-vision-based ship cleaning device further comprises:
a stain area acquisition module: acquiring an initial image of a part to be cleaned of a ship through a pan-tilt camera and feeding the initial image back to a control module, and marking a plurality of rectangular stain areas 2 in the initial image through the control module;
a travel track acquisition module: the control module divides all the stain areas 2 into N area groups 1 according to the abscissa or ordinate of each stain area's center point, determines from the ordinate or abscissa of the center points the order in which the robot passes through the stain areas 2 of the ith area group 1, and after the last stain area 2 of the ith group 1 has the robot enter the nearest stain area 2 of the adjacent area group 1, thereby obtaining the robot's travel track;
a cleaning track acquisition module: the control module divides each stain area 2, transversely or longitudinally, into M equal cleaning areas 3, each narrower than the diameter of the robot's cleaning disc; the bisectors and the side lines of the rectangle form the robot's cleaning track within the stain area 2, running from one end of a side line parallel to the bisectors to its other end, then horizontally to the adjacent bisector, and so on until the end of the opposite parallel side line;
a cleaning module: the control module makes the robot travel from the starting point of the travel track, taking that point as the robot's first position; after the robot travels a certain distance its position is acquired as a second position and its actual traveling direction is derived from the two positions; if that direction deviates from the travel track, the robot's posture is adjusted so that it continues along a new travel track, the new track being the line from the stopping point to the end point of the sub-track it was on; the check is repeated at intervals until the target stain area 2 is reached, whereupon cleaning proceeds along the cleaning track.
The above is only a preferred embodiment of the present invention and should not be taken as limiting its scope, which is defined by the appended claims together with their equivalents and modifications within the scope of the description.

Claims (10)

1. A ship cleaning method based on machine vision is characterized in that: the method comprises the following steps:
A. acquiring an initial image of a part to be cleaned of a ship, and marking a plurality of rectangular stain areas in the initial image;
B. dividing all the stain areas into N area groups according to the abscissa (or ordinate) of each stain area's center point; determining, from the ordinate (or abscissa) of the center points, the order in which the robot passes through the stain areas of the ith area group; after the last stain area of the ith group, entering the stain area of the (i+1)th group that lies at the end nearest to it; and repeating back and forth to obtain the robot's travel track;
C. dividing each stain area, transversely or longitudinally, into M equal parts to obtain a plurality of cleaning areas, each narrower than the diameter of the robot's cleaning disc; the bisectors and the side lines of each rectangular stain area form the robot's cleaning track within that area: starting at one end of a side line parallel to the bisectors, the track runs to the other end of that side line, then moves horizontally to the adjacent bisector, and so on until it reaches the end of the opposite parallel side line;
D. controlling the robot to start to travel from the starting point of the travel track, and taking the starting point as a first position of the robot;
E. after the robot continues to travel for a certain distance, acquiring the current position of the robot as a second position, acquiring the actual traveling direction of the robot according to the first position and the second position, judging whether the robot deviates from the traveling track or not according to the actual traveling direction, if so, entering step F, otherwise, entering step G;
F. stopping the robot, adjusting the posture of the robot and enabling the robot to move according to a new moving track, wherein the new moving track is a connecting line between the stopping point and the moving track end point, and entering the step G;
G. judging whether the target stain area is reached, if so, cleaning according to the cleaning track, otherwise, entering the step H;
H. the robot position at this time is set to the new first position, and the process proceeds to step E.
2. The machine vision based ship cleaning method according to claim 1, characterized in that: the step E of acquiring the actual traveling direction of the robot comprises the following steps:
e1, after the robot travels a certain distance, shooting a second image containing the robot, and inputting the image into the trained deep learning model to identify the robot;
e2, judging whether the confidence of the recognition result in the step E1 reaches a preset value, if so, entering the step E3, otherwise, entering the step E1;
e3, obtaining an ROI area according to the recognition result, and carrying out proportional transformation and Gaussian filtering on the ROI area;
e4, extracting the target feature information of the ROI processed in the step E3 to obtain at least one target feature, and taking the feature target with the minimum difference between the central position and the first position as a final target, wherein the central position of the final target is the second position;
and E5, acquiring a first vector from the first position to the second position, wherein the direction of the first vector is the actual traveling direction of the robot.
3. The machine vision based ship washing method of claim 2, characterized in that: judging in step E whether the robot deviates from the travel track comprises the following steps: taking the direction of the robot's planned travel path as a second vector, calculating the included angle theta between the first vector and the second vector, and judging that the robot has deviated from the travel track when theta exceeds a preset angle threshold; and in step F the posture is adjusted by rotating the robot back through theta.
4. A machine vision based ship cleaning method according to claim 1, 2 or 3, characterized in that: the step B specifically comprises the following steps:
B1, acquiring the coordinates of the center point of each stain area, and sorting the center points by the magnitude of their abscissa or ordinate values;
B2, dividing the sorted center points into N groups, thereby dividing the stain areas into N area groups, the abscissa or ordinate values of the center points of the stain areas in each area group lying in the same range;
B3, sorting the center points of the stain areas in the i-th area group by the magnitude of the ordinate or abscissa, which yields the partial traveling track within that area group;
B4, after passing through the last stain area of the i-th area group along the partial traveling track, the robot enters the stain area of the (i+1)-th area group that lies at the end nearest to it, and so on back and forth, forming the traveling track of the robot.
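Steps B1 to B4 describe a boustrophedon (back-and-forth) ordering of the stain centers. A simplified Python sketch, assuming equal-count groups rather than coordinate-range groups and aligned bands, so that reversing every other group makes the robot enter each next group at its nearest end; all names are hypothetical:

```python
def travel_track(centers, n_groups):
    """Boustrophedon ordering of stain center points (steps B1-B4, simplified).

    centers  -- list of (x, y) stain-area center coordinates
    n_groups -- the N of step B2
    Returns the centers in the order the robot should visit them.
    """
    pts = sorted(centers)                      # B1: sort by abscissa
    size = -(-len(pts) // n_groups)            # ceiling division -> group size
    groups = [pts[i:i + size] for i in range(0, len(pts), size)]  # B2
    track = []
    for i, g in enumerate(groups):
        g = sorted(g, key=lambda p: p[1])      # B3: sort group by ordinate
        if i % 2 == 1:                         # B4: reverse every other group so
            g.reverse()                        # the next group starts at the near end
        track.extend(g)
    return track
```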
5. A machine vision based ship cleaning method according to claim 1, 2 or 3, characterized in that: in step C, before the M equal division is carried out, the width of the cleaning area is determined by calculating the ratio of the actual size of the robot to its size in the initial image, thereby determining the value of M.
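Claim 5 uses the robot's known real size as a pixel-to-world scale reference. A hedged sketch of how M might then be chosen so that each lane stays narrower than the cleaning disc (the formula and names are assumptions, not stated in the claim):

```python
import math

def compute_m(robot_real_width, robot_pixel_width,
              stain_pixel_width, cleaning_disc_diameter):
    """Step C sketch: convert the stain rectangle's pixel width to real-world
    units via the robot's known size, then pick the smallest M whose lane
    width is at most the cleaning-disc diameter.

    Returns (M, lane_width_in_real_units).
    """
    scale = robot_real_width / robot_pixel_width       # metres per pixel
    stain_real_width = stain_pixel_width * scale
    m = math.ceil(stain_real_width / cleaning_disc_diameter)
    return m, stain_real_width / m
```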
6. A machine vision based ship cleaning method according to claim 2 or 3, characterized in that: in step E1, the distance corresponds to 10-15 frames of images.
7. A machine vision based ship cleaning method according to claim 1, 2 or 3, characterized in that: in the step E of judging whether the actual traveling direction of the robot deviates from the traveling track, the traveling track of step B comprises a plurality of sub-tracks, each sub-track corresponding to the route between two stain areas.
8. A machine vision based ship cleaning method according to claim 1, 2 or 3, characterized in that: the ship has a plurality of parts to be cleaned, each part to be cleaned corresponds to one robot, each part is cleaned according to steps A to H, and the robots of two adjacent parts to be cleaned can communicate with each other during cleaning.
9. A machine vision based ship cleaning method according to claim 1, 2 or 3, characterized in that: the initial image and the second image are both obtained by the pan-tilt camera, the pan-tilt camera is spaced apart from the ship, and the images captured by the pan-tilt camera can contain the robot.
10. A machine vision based ship cleaning device, characterized in that: the device comprises a pan-tilt camera, a control module and the following modules:
a stain area acquisition module: acquiring an initial image of the part of the ship to be cleaned through the pan-tilt camera and feeding it back to the control module, the control module marking a plurality of rectangular stain areas in the initial image;
a traveling track acquisition module: the control module divides all the stain areas into N area groups according to the abscissa or ordinate of the center point of each stain area, determines the order in which the robot passes through the stain areas of the i-th area group according to the ordinate or abscissa of the center points, and after the last stain area of the i-th area group the robot enters the nearest stain area of the adjacent area group, thereby obtaining the traveling track of the robot;
a cleaning track acquisition module: the control module divides each stain area transversely or longitudinally into M equal parts to obtain a plurality of cleaning areas, the width of each cleaning area being smaller than the diameter of the robot's cleaning disc; each bisector and the side lines of the rectangle form the cleaning track of the robot within the stain area, the cleaning track running from one end of a side line parallel to the bisectors to the other end, then across to the adjacent bisector, and so on until reaching the end of the opposite parallel side line;
a cleaning module: the control module controls the robot to start moving from the starting point of the traveling track, taking the starting point as the first position of the robot; after the robot has traveled a certain distance, its current position is acquired as the second position, and the actual traveling direction of the robot is obtained from the first and second positions; if the actual traveling direction deviates from the traveling track, the robot's posture is adjusted and it proceeds along a new traveling track, the deviation judgment being repeated at intervals until the target stain area is reached, whereupon cleaning is carried out according to the cleaning track.
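The serpentine sweep produced by the cleaning track acquisition module can be illustrated as a waypoint generator: M equal lanes across the rectangle's width, each swept end to end with the direction reversed on alternate lines (the two side lines plus the M-1 internal bisectors). A minimal sketch under those assumptions, with hypothetical names:

```python
def cleaning_track(x0, y0, width, height, m):
    """Waypoints for sweeping a rectangular stain area divided into m equal
    lanes: traverse each of the m+1 vertical lines (two side lines and the
    m-1 bisectors) top-to-bottom and bottom-to-top alternately."""
    lane_w = width / m
    track = []
    for i in range(m + 1):
        x = x0 + i * lane_w
        if i % 2 == 0:
            track += [(x, y0), (x, y0 + height)]      # sweep downwards
        else:
            track += [(x, y0 + height), (x, y0)]      # sweep back upwards
    return track
```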
CN202111260358.4A 2021-10-28 2021-10-28 Ship cleaning method and device based on machine vision Active CN114162279B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111260358.4A CN114162279B (en) 2021-10-28 2021-10-28 Ship cleaning method and device based on machine vision

Publications (2)

Publication Number Publication Date
CN114162279A true CN114162279A (en) 2022-03-11
CN114162279B CN114162279B (en) 2023-12-19

Family

ID=80477710

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111260358.4A Active CN114162279B (en) 2021-10-28 2021-10-28 Ship cleaning method and device based on machine vision

Country Status (1)

Country Link
CN (1) CN114162279B (en)

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120006352A1 (en) * 2009-11-23 2012-01-12 Searobotics Corporation Robotic submersible cleaning system
WO2013013679A1 (en) * 2011-07-27 2013-01-31 Mms Equipment A/S A ship hull cleaning system for removing fouling
CN104176208A (en) * 2014-07-17 2014-12-03 江苏南通申通机械有限公司 Sea creature killing device for ship and sea creature stacking image recognition system based on MATLAB (matrix laboratory)
CN109263828A (en) * 2018-11-06 2019-01-25 高溯 Method and apparatus for being cleaned to navigating ship in ocean
CN209241278U (en) * 2018-11-06 2019-08-13 高溯 Equipment for being cleaned to navigating ship in ocean
CN110182332A (en) * 2019-05-17 2019-08-30 清华大学 Combined-type climbs wall type ship Intelligent Laser rust removalling equipment
CN110525604A (en) * 2019-06-12 2019-12-03 西湖大学 A kind of ship wall region piecemeal cleaning method, device, equipment and storage medium
CN111232150A (en) * 2020-01-16 2020-06-05 中国海洋大学 Hull wall surface cleaning system and cleaning operation method
CN111605676A (en) * 2020-06-12 2020-09-01 中国海洋大学 Ship cleaning robot and cleaning method
CN112407179A (en) * 2020-12-10 2021-02-26 江苏科技大学 Underwater cleaning device for marine equipment and cleaning control method thereof

Also Published As

Publication number Publication date
CN114162279B (en) 2023-12-19

Similar Documents

Publication Publication Date Title
CN109365318B (en) Multi-robot cooperation sorting method and system
CN109731708B (en) Automatic paint spraying method for automobile maintenance based on image recognition
CN106425181A (en) Curve welding seam welding technology based on line structured light
CN108445880A (en) The autonomous mooring system of unmanned boat and method merged based on monocular vision and laser data
KR101454855B1 (en) Ship hull inspection and analysys system, and method thereof
CN110281231B (en) Three-dimensional vision grabbing method for mobile robot for unmanned FDM additive manufacturing
CN103324198A (en) Automatic card collecting and positioning booting system based on computer vision technology and application method thereof
CN106647758A (en) Target object detection method and device and automatic guiding vehicle following method
CN109993788B (en) Deviation rectifying method, device and system for tyre crane
CN112319429B (en) Control system and control method for automatically cleaning front windshield of bullet train
CN112319464A (en) Automatic parking method, device, equipment and storage medium
JPH02256430A (en) Automatic assembly device and method
CN110738668B (en) Method and system for intelligently controlling high beam and vehicle
CN114162279A (en) Ship cleaning method and device based on machine vision
CN107186476A (en) Intelligent mobile process line and its processing method
CN116312014B (en) Intelligent navigation method for parking lot
CN116539614A (en) Vertical skip lining plate damage detection method based on machine vision
CN213196231U (en) Welding seam tracking robot
CN114952901A (en) Intelligent control system of vehicle cleaning robot
CN113715935A (en) Automatic assembling system and automatic assembling method for automobile windshield
CN114384080A (en) Batch detection method and system for rubber ring defects
CN113298727A (en) Underground auxiliary transport vehicle navigation system and method based on multiple identification lines
CN109241567B (en) Carrier-based aircraft automatic scheduling semi-physical simulation system and method
Bae et al. Prototyping a system of cost-effective autonomous guided vehicles
CN105335961B (en) A kind of filter bag pastes clamp method automatically

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant