CN114162279B - Ship cleaning method and device based on machine vision - Google Patents


Info

Publication number
CN114162279B
Authority
CN
China
Prior art keywords
robot
cleaning
stain
area
track
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202111260358.4A
Other languages
Chinese (zh)
Other versions
CN114162279A (en)
Inventor
陈钟恭
郑华锋
张飞檐
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Fujian Weibai Industrial Robot Co ltd
Original Assignee
Fujian Weibai Industrial Robot Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Fujian Weibai Industrial Robot Co ltd
Priority to CN202111260358.4A
Publication of CN114162279A
Application granted
Publication of CN114162279B
Legal status: Active

Classifications

    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B63 - SHIPS OR OTHER WATERBORNE VESSELS; RELATED EQUIPMENT
    • B63B - SHIPS OR OTHER WATERBORNE VESSELS; EQUIPMENT FOR SHIPPING
    • B63B59/00 - Hull protection specially adapted for vessels; Cleaning devices specially adapted for vessels
    • B63B59/06 - Cleaning devices for hulls

Landscapes

  • Chemical & Material Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Combustion & Propulsion (AREA)
  • Mechanical Engineering (AREA)
  • Ocean & Marine Engineering (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

The invention provides a machine-vision-based ship cleaning method and device, wherein the method comprises the following steps: acquiring a plurality of rectangular stain areas; deriving the robot's travel track from the abscissa or ordinate of each stain area's center point; dividing each stain area into M equal parts to obtain a plurality of cleaning areas, and deriving the cleaning track from the bisectors and the edges; each time the robot travels a fixed distance, judging whether it has deviated from the travel track and, if so, adjusting its posture so that it returns to the travel track; and, after reaching the target stain area, cleaning along the cleaning track. With the invention, the specific cleaning process does not need to be controlled manually, and cleaning efficiency and cleaning effect are effectively improved.

Description

Ship cleaning method and device based on machine vision
Technical Field
The invention relates to a ship cleaning method and device based on machine vision.
Background
A ship is a vehicle that can navigate or moor in a body of water for transportation or operation, with different technical properties, equipment and structures according to different usage requirements. During use, the hull plating is immersed in water for long periods, so it is easily affected by mud and microorganisms: dirt and organisms attach to its surface, and the plating may even rust, damaging the hull structure. The ship therefore needs to be cleaned from time to time.
Because of the special shape and structure of a ship, a robot is generally used for cleaning, but existing robots need manual control to complete the cleaning task, which is time-consuming and labor-intensive, and the cleaning efficiency is low.
Disclosure of Invention
The invention provides a machine-vision-based ship cleaning method and device in which the robot cleans automatically according to a travel track and a cleaning track; the specific cleaning process does not need to be controlled manually, and cleaning efficiency and cleaning effect are effectively improved.
The invention is realized by the following technical scheme:
a ship cleaning method based on machine vision comprises the following steps:
A. acquiring an initial image of the part of the ship to be cleaned, and marking a plurality of rectangular stain areas in the initial image;
B. dividing all the stain areas into N area groups according to the abscissa or the ordinate of the center point of each stain area; determining, from the ordinate or the abscissa of the center points, the order in which the robot passes through the stain areas of the i-th area group; after the last stain area of the i-th area group, entering the stain area at the end of the (i+1)-th area group that is closest to it; and repeating in sequence to obtain the travel track of the robot;
C. dividing each stain area transversely or longitudinally into M equal parts to obtain a plurality of cleaning areas, the width of each cleaning area being smaller than the diameter of the robot's cleaning disc; each bisector and the edges of the rectangular stain area form the robot's cleaning track in that stain area, which starts from one end of an edge parallel to the bisectors, runs to the other end of that edge, then runs horizontally to the adjacent bisector, and so on until it reaches the end of the other edge parallel to the bisectors;
D. controlling the robot to start traveling from the starting point of the travel track, taking that starting point as the robot's first position;
E. after the robot has traveled a certain distance, acquiring its position at that moment as the second position, obtaining the robot's actual traveling direction from the first and second positions, and judging from the actual traveling direction whether the robot has deviated from the travel track; if so, entering step F, otherwise entering step G;
F. stopping the robot, adjusting its posture, and making it travel along a new travel track, the new travel track being the line from the stopping point to the end point of the current travel track; then entering step G;
G. judging whether the target stain area has been reached; if so, cleaning according to the cleaning track, otherwise entering step H;
H. taking the robot's position at this moment as the new first position, and returning to step E (a minimal code sketch of the loop in steps D-H follows this list).
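The following Python sketch illustrates the control loop of steps D-H. Everything in it is an assumption for illustration: the patent specifies no API, so the robot methods (`advance`, `position`, `rotate`, `distance_to`, `clean_along_track`) and the fixed check distance are hypothetical stand-ins, and the 5-degree threshold is taken from the embodiment below.

```python
import math

ANGLE_THRESHOLD = math.radians(5)  # the embodiment below uses a 5-degree threshold

def angle_between(v1, v2):
    """Unsigned angle between two 2-D vectors, in radians."""
    a = math.atan2(v1[1], v1[0]) - math.atan2(v2[1], v2[0])
    a = abs(a) % (2 * math.pi)
    return min(a, 2 * math.pi - a)

def travel_and_clean(robot, sub_tracks, check_dist):
    """Steps D-H: follow each sub-track, checking the course every check_dist."""
    for start, end in sub_tracks:              # one sub-track per pair of adjacent stain areas
        first = start                          # step D: start point = first position
        while robot.distance_to(end) > check_dist:
            robot.advance(check_dist)          # step E: travel a fixed distance
            second = robot.position()          # second position, observed by the camera
            actual = (second[0] - first[0], second[1] - first[1])
            planned = (end[0] - first[0], end[1] - first[1])
            theta = angle_between(actual, planned)
            if theta > ANGLE_THRESHOLD:        # step F: off track, stop and re-aim
                robot.rotate(-theta)           # reverse-rotate by theta
                # the new travel track is the line from the stop point to `end`
            first = second                     # step H: current position -> new first position
        robot.clean_along_track()              # step G: target stain area reached
```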
Further, obtaining the actual traveling direction of the robot in step E comprises the following steps:
E1. after the robot has traveled a certain distance, shooting a second image containing the robot and inputting it into a trained deep learning model to recognize the robot;
E2. judging whether the confidence of the recognition result of step E1 reaches a preset value; if so, entering step E3, otherwise returning to step E1;
E3. obtaining an ROI region from the recognition result, and performing proportional scaling and Gaussian filtering on the ROI region;
E4. extracting target feature information from the ROI region processed in step E3 to obtain at least one target feature, and taking the target whose center position differs least from the first position as the final target; the center position of the final target is the second position;
E5. obtaining a first vector from the first position to the second position; the direction of this first vector is the actual traveling direction of the robot.
Further, judging in step E whether the robot deviates from the travel track comprises: taking the direction of the robot's planned travel track as a second vector, calculating the included angle θ between the first vector and the second vector, and judging that the robot has deviated from the travel track when θ is larger than a preset angle threshold; in step F, the posture is adjusted by rotating the robot back by θ.
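As a hedged numeric illustration of this test (the positions are made-up pixel coordinates; the 5-degree threshold is the value used in the embodiment below):

```python
import math

first = (0.0, 0.0)                 # hypothetical first position (pixels)
second = (10.0, 1.0)               # hypothetical second position
track_end = (20.0, 0.0)            # end point of the planned sub-track

v1 = (second[0] - first[0], second[1] - first[1])        # first vector: actual direction
v2 = (track_end[0] - first[0], track_end[1] - first[1])  # second vector: planned direction
cos_theta = (v1[0] * v2[0] + v1[1] * v2[1]) / (math.hypot(*v1) * math.hypot(*v2))
theta = math.acos(cos_theta)
print(math.degrees(theta))  # ~5.71 deg > 5 deg threshold -> deviated; rotate back by theta
```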
Further, step B specifically comprises the following (a grouping-and-ordering sketch follows this list):
B1. acquiring the center-point coordinates of each stain area and sorting the center points by abscissa or ordinate value;
B2. dividing the sorted center points into N groups, i.e. dividing the stain areas into N area groups, the abscissas or ordinates of the center points in each area group falling within the same range;
B3. sorting the center points of the stain areas within the i-th area group by ordinate or abscissa value; this ordering is the partial travel track within that area group;
B4. after the robot has passed through the last stain area of the i-th area group along the partial travel track, entering the stain area at the end of the (i+1)-th area group that is closest to it, and repeating in sequence to obtain the travel track of the robot.
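A minimal sketch of B1-B4, assuming grouping by abscissa and ordering by ordinate (the mirrored variant swaps the axes). The fixed `GROUP_WIDTH` of 100 is only an assumption borrowed from the 0-100 / 101-200 example in the embodiment below; the patent does not prescribe how the N ranges are chosen.

```python
GROUP_WIDTH = 100  # assumed range width, e.g. group 1: x in [0, 100), group 2: x in [100, 200)

def travel_order(centers):
    """Return stain-area center points in the reciprocating visiting order of B1-B4."""
    groups = {}
    for c in centers:                                  # B1/B2: bucket centers by abscissa range
        groups.setdefault(int(c[0] // GROUP_WIDTH), []).append(c)
    route = []
    for n, key in enumerate(sorted(groups)):
        col = sorted(groups[key], key=lambda c: c[1])  # B3: order each group by ordinate
        if n % 2 == 1:
            col.reverse()                              # B4: enter the nearest end of the next group
        route.extend(col)
    return route

print(travel_order([(10, 5), (30, 90), (120, 80), (150, 10)]))
# -> [(10, 5), (30, 90), (120, 80), (150, 10)]  (boustrophedon order between groups)
```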
Further, the initial image includes the robot; in step C, before the division into M equal parts, the ratio of the robot's actual size to its size in the initial image is calculated in order to determine the width of the cleaning areas and thereby the value of M, for example as sketched below.
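A sketch of that scale computation under assumed numbers (the robot and disc sizes and the pixel measurements are all illustrative; the patent only states that the ratio is used to fix the cleaning-area width and hence M):

```python
import math

robot_real_mm   = 600.0    # assumed real size of the robot
robot_px        = 60.0     # its size as measured in the initial image
disc_real_mm    = 300.0    # assumed cleaning-disc diameter
region_width_px = 200.0    # width of one rectangular stain area in the image

mm_per_px = robot_real_mm / robot_px             # image scale from the known robot size
disc_px = disc_real_mm / mm_per_px               # disc diameter expressed in pixels
M = math.floor(region_width_px / disc_px) + 1    # smallest M giving strips narrower than the disc
strip_px = region_width_px / M
assert strip_px < disc_px                        # required: no gaps between passes
print(M, round(strip_px, 1))                     # -> 7 28.6
```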
Further, in step E1, the distance traveled corresponds to 10-15 frames of images.
Further, in step E, what is judged is whether the robot's actual traveling direction deviates from the travel track, the travel track here being the sub-track segment the robot is currently on.
Further, the ship has a plurality of parts to be cleaned, each corresponding to one robot; each part is cleaned according to steps A to H, and during cleaning the robots of two adjacent parts to be cleaned can communicate with each other.
Further, the initial image and the second image are obtained by a pan-tilt camera; the pan-tilt camera is spaced apart from the ship, and the images it shoots can contain the robot.
The invention is also realized by the following technical scheme:
the utility model provides a ship cleaning device based on machine vision, includes cloud platform camera, control module and following module:
stain region acquisition module: acquiring an initial image of a part to be cleaned of the ship through a pan-tilt camera, feeding back the initial image to a control module, and marking a plurality of rectangular stain areas at the position in the initial image through the control module;
the traveling track acquisition module is used for: the control module divides all the stain areas into N area groups according to the abscissa or the ordinate of the center point of each stain area, determines the sequence of the robot passing through each stain area of the ith area group in sequence according to the ordinate or the abscissa of the center point, and enters the stain area closest to the stain area in the adjacent area group after passing through the last stain area in the ith area group, so as to obtain the advancing track of the robot;
the cleaning track acquisition module is used for: the control module divides each stain area into a plurality of cleaning areas in a transverse or longitudinal mode, the width of each cleaning area is smaller than the diameter of a robot cleaning disc, each bisector and rectangular side lines are cleaning tracks of the robot in the stain areas, the cleaning tracks start from one end of one side line parallel to the bisector, travel to the other end of the side line, then travel to the bisector adjacent to the side line in a horizontal mode, and travel to the end of the other side line parallel to the bisector;
and (3) a cleaning module: the control module controls the robot to start traveling from the starting point of the traveling track, takes the starting point as a first position of the robot, acquires the position of the robot at the moment as a second position after the robot continues traveling for a certain distance, obtains the actual traveling direction of the robot according to the first position and the second position, adjusts the gesture of the robot and enables the robot to travel according to the new traveling track if the actual traveling direction deviates from the traveling track, judges whether the actual traveling direction deviates at intervals of the distance until the target stain area is reached, and then cleans according to the cleaning track.
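Read together, the four modules suggest a structure like the following skeleton. All class and method names are hypothetical; the patent defines the modules only by their behaviour.

```python
class StainRegionModule:
    def mark_regions(self, initial_image): ...   # operator marks rectangular stain areas

class TravelTrackModule:
    def build(self, regions): ...                # N groups -> reciprocating route between areas

class CleaningTrackModule:
    def build(self, region, disc_diameter): ...  # M strips -> serpentine path inside one area

class CleaningModule:
    def run(self, robot, travel_track, cleaning_tracks): ...  # the D-H travel/clean loop
```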
The invention has the following beneficial effects:
1. Before cleaning, all stain areas are divided into N area groups according to the abscissa or ordinate of their center points, and the robot's travel track is determined from the ordinate or abscissa of the center points within each group, so the travel track is the shortest path. Each stain area is then divided transversely or longitudinally into M equal parts to obtain a plurality of cleaning areas, and the robot's cleaning track is determined from the edges and bisectors of the rectangular stain area. The robot thus cleans automatically along the travel track and the cleaning track without manual control of the specific cleaning process, which saves manpower and effectively improves cleaning efficiency and cleaning effect; computing the travel track and the cleaning track involves little computation and does not add an excessive processing burden. When the cleaning areas are divided, their width is kept smaller than the diameter of the robot's cleaning disc, so no gaps are left between passes and the cleaning effect is better. While the robot is traveling, whether it has deviated from the travel track is checked at intervals; when a deviation is confirmed, the robot's posture is adjusted in time and it travels along a new travel track, so the cleaning work can be completed successfully.
2. The ship has a plurality of parts to be cleaned, which can all be cleaned at the same time, greatly improving cleaning efficiency; the robots of two adjacent parts to be cleaned can communicate, so accidents such as collisions between two robots can be avoided.
Drawings
The invention is described in further detail below with reference to the accompanying drawings.
FIG. 1 is a flow chart of the present invention.
FIG. 2 is a schematic illustration of the travel track and the cleaning track of the present invention.
Detailed Description
The machine-vision-based ship cleaning device comprises a control module, a plurality of pan-tilt cameras and a plurality of robots. The area of the ship that needs cleaning is divided into a plurality of parts to be cleaned, and each part is matched with one or two pan-tilt cameras and one robot. The pan-tilt cameras are spaced apart from the ship and positioned in front of the parts to be cleaned so that they can shoot the robots. The control module is a computer connected to the pan-tilt cameras, and the pictures shot by the cameras are fed back to the computer in real time. Each part to be cleaned is cleaned simultaneously according to the machine-vision-based ship cleaning method described below, and the two robots corresponding to two adjacent parts to be cleaned can communicate with each other so as to avoid accidents such as collisions.
As shown in fig. 1 and 2, the machine vision-based ship cleaning method includes the following steps:
A. the pan-tilt camera acquires an initial image of the part of the ship to be cleaned and feeds it back to the computer; corresponding software is installed on the computer, through which a worker marks a plurality of rectangular stain areas 2 in the initial image according to the actual condition of the ship shown in it; the initial image also contains the robot; the software is prior art;
B. dividing all the stain areas 2 into N area groups 1 according to the abscissa of the center point of each stain area 2, determining from the ordinate of the center points the order in which the robot passes through the stain areas 2 of the i-th area group 1, entering after the last stain area 2 of the i-th area group 1 the stain area 2 at the end of the (i+1)-th area group 1 that is closest to it, and repeating in sequence to obtain the travel track of the robot;
the method specifically comprises the following steps:
B1. acquiring the center-point coordinates of each stain area 2 and sorting the center points by abscissa value; the process of obtaining the center-point coordinates is prior art;
B2. dividing the sorted center points into N groups, i.e. dividing the stain areas 2 into N area groups 1, the abscissas of the center points in each area group 1 falling within the same range; for example, the abscissas of the first group's center points lie in the range 0-100 and those of the second group in the range 101-200;
B3. sorting the center points of the stain areas 2 within the i-th area group 1 by ordinate value; this ordering is the partial travel track within that area group 1;
B4. starting from the stain area 2 with the smallest center-point ordinate in the i-th area group 1, the robot passes through each stain area 2 of the i-th area group 1 in turn along the partial travel track and then enters the stain area 2 with the largest center-point ordinate in the (i+1)-th area group 1; repeating in sequence gives the travel track of the robot. The travel track comprises several sub-tracks, each corresponding to the route between two adjacent stain areas 2; as shown in FIG. 2, line segment BC is one sub-track;
in this embodiment, grouping is by the abscissa of each center point and sorting within each area group 1 is by ordinate; in other embodiments, grouping may be by ordinate and sorting within each area group 1 by abscissa;
C. dividing each stain area 2 longitudinally into M equal parts to obtain a plurality of cleaning areas 3, the width of each cleaning area 3 being smaller than the diameter of the robot's cleaning disc; each bisector and the edges of the rectangular stain area 2 form the robot's cleaning track in that stain area 2, which starts from one end of an edge parallel to the bisectors, runs to the other end of that edge, then runs horizontally to the adjacent bisector, and so on until it reaches the end of the other edge parallel to the bisectors, as shown in FIG. 2 and in the waypoint sketch below;
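A sketch of the serpentine cleaning track for one rectangular stain area 2: M-1 bisectors plus the two parallel edges give M+1 passes, linked end to end. The rectangle coordinates and M below are illustrative assumptions; the constraint that the strip width stays below the disc diameter is handled separately, as in step C above.

```python
def cleaning_track(x0, y0, x1, y1, M):
    """Waypoints along the two parallel edges and the M-1 bisectors of the rectangle."""
    width = (x1 - x0) / M
    points, down = [], True
    for k in range(M + 1):                 # k = 0 and k = M are the rectangle's own edges
        x = x0 + k * width
        ys = (y0, y1) if down else (y1, y0)
        points.append((x, ys[0]))          # run along the vertical line...
        points.append((x, ys[1]))          # ...then step horizontally to the next one
        down = not down                    # reverse direction on each pass
    return points

print(cleaning_track(0, 0, 30, 100, 3))
# (0.0,0)->(0.0,100)->(10.0,100)->(10.0,0)->(20.0,0)->(20.0,100)->(30.0,100)->(30.0,0)
```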
before the division into M equal parts, the computer calculates the ratio of the robot's actual size to its size in the initial image in order to determine the width of the divided cleaning areas 3, i.e. to determine the value of M;
since the stain areas 2 are not necessarily the same size and the cleaning areas 3 need not all have the same width, the value of M need not be the same for every stain area 2; it is only necessary to ensure that the width of each cleaning area 3 is smaller than the diameter of the robot's cleaning disc, and to improve cleaning efficiency the cleaning areas 3 should be as wide as possible while remaining smaller than the disc diameter. A stain area 2 may not divide exactly; in that case the last cleaning area 3 simply does not participate in the equal division, and its width is likewise smaller than the diameter of the cleaning disc;
in this embodiment the stain area 2 is divided into M equal parts longitudinally; in other embodiments it may be divided into M equal parts transversely;
in FIG. 2, the thick lines between A and D are the travel track and the cleaning track of the robot; the travel track comprises several sub-tracks, of which line segment BC is one; the thick lines between A and B are the cleaning track, and line segment AB is a bisector;
D. controlling the robot to start traveling from the starting point of the travel track (namely the lower end A of the edge of the first stain area 2 in the first area group 1), taking that starting point as the robot's first position;
E. after the robot has traveled a certain distance, acquiring its position at that moment as the second position, obtaining the robot's actual traveling direction from the first and second positions, and judging from the actual traveling direction whether the robot has deviated from the travel track, i.e. from the sub-track it is currently on; if so, entering step F, otherwise entering step G;
obtaining the actual traveling direction of the robot comprises the following steps:
E1. after the robot has traveled a certain distance, the computer obtains from the pan-tilt camera a second image, shot by the camera, that contains the robot, and inputs it into a trained deep learning model to recognize the robot; here the distance corresponds to 10 frames of images, i.e. the distance the robot travels in the time spanned by 10 frames; the deep learning model is prior art;
E2. judging whether the confidence of the recognition result of step E1 reaches a preset value (here set to 0.8); if so, entering step E3, otherwise returning to step E1;
E3. obtaining an ROI region from the recognition result, and performing proportional scaling and Gaussian filtering on the ROI region;
E4. extracting target feature information from the ROI region processed in step E3 to obtain at least one target feature, and taking the target whose center position differs least from the first position as the final target; the center position of the final target is the second position; if no target feature is found, a second image is acquired again and recognition is repeated;
E5. obtaining a first vector from the first position to the second position; the direction of this first vector is the actual traveling direction of the robot;
taking the direction of the sub-track the robot is currently on as a second vector, the included angle θ between the first vector and the second vector is calculated; when θ is larger than a preset angle threshold, the robot is judged to have deviated from the travel track; in this embodiment the angle threshold is set to 5° (a code sketch of steps E1-E4 follows);
F. stopping the robot, adjusting its posture, and making it continue along a new travel track, the new travel track being the line from the stopping point to the end point of the current sub-track; then entering step G; the posture adjustment consists specifically of rotating the robot back by θ;
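A hedged OpenCV-style sketch of steps E1-E4. The `detect` callable stands in for the trained deep learning model (the patent does not name one), and the 0.5 scaling factor, 5x5 Gaussian kernel, and Otsu thresholding used to extract candidate target features are assumptions; only the 0.8 confidence gate comes from this embodiment.

```python
import math
import cv2

CONFIDENCE_MIN = 0.8  # preset confidence value from step E2

def second_position(frame, first_pos, detect):
    """Return the robot's second position, or None if recognition is not confident."""
    conf, (x, y, w, h) = detect(frame)            # E1: recognize the robot in the frame
    if conf < CONFIDENCE_MIN:                     # E2: confidence gate
        return None                               # caller captures a new second image
    roi = frame[y:y + h, x:x + w]                 # E3: ROI from the recognition result
    roi = cv2.resize(roi, None, fx=0.5, fy=0.5)   # proportional scaling (factor assumed)
    roi = cv2.GaussianBlur(roi, (5, 5), 0)        # Gaussian filtering
    gray = cv2.cvtColor(roi, cv2.COLOR_BGR2GRAY)
    _, mask = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    best, best_d = None, float("inf")
    for c in contours:                            # E4: candidate target features
        m = cv2.moments(c)
        if m["m00"] == 0:
            continue
        cx = x + 2 * m["m10"] / m["m00"]          # center back in full-image coordinates
        cy = y + 2 * m["m01"] / m["m00"]          # (undo the 0.5 scaling)
        d = math.hypot(cx - first_pos[0], cy - first_pos[1])
        if d < best_d:                            # keep the center closest to the first position
            best, best_d = (cx, cy), d
    return best                                   # E5: first_pos -> best gives the first vector
```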
G. judging whether the target stain area has been reached; if so, cleaning according to the cleaning track, otherwise entering step H; as shown in FIG. 2, if the robot's sub-track is line segment BC, the stain area 2 corresponding to point C is the target stain area;
H. taking the robot's position at this moment as the new first position, and returning to step E.
From the above method it follows that the machine-vision-based ship cleaning device further comprises:
a stain region acquisition module: acquiring an initial image of the part of the ship to be cleaned through the pan-tilt camera and feeding it back to the control module, a plurality of rectangular stain areas 2 being marked in the initial image via the control module;
a travel track acquisition module: the control module divides all the stain areas 2 into N area groups 1 according to the abscissa or the ordinate of the center point of each stain area 2, determines from the ordinate or the abscissa of the center points the order in which the robot passes through the stain areas 2 of the i-th area group 1, and, after the last stain area 2 of the i-th area group 1, enters the stain area 2 of the adjacent area group 1 that is closest to it, thereby obtaining the travel track of the robot;
a cleaning track acquisition module: the control module divides each stain area 2 transversely or longitudinally into M equal parts to obtain a plurality of cleaning areas 3, the width of each cleaning area 3 being smaller than the diameter of the robot's cleaning disc; each bisector and the edges of the rectangular stain area 2 form the robot's cleaning track in that stain area 2, which starts from one end of an edge parallel to the bisectors, runs to the other end of that edge, then runs horizontally to the adjacent bisector, and so on until it reaches the end of the other edge parallel to the bisectors;
a cleaning module: the control module controls the robot to start traveling from the starting point of the travel track, taking that starting point as the robot's first position; after the robot has traveled a certain distance it acquires the robot's position at that moment as the second position and obtains the robot's actual traveling direction from the first and second positions; if the actual traveling direction deviates from the travel track, it adjusts the robot's posture so that the robot continues along a new travel track, namely the line from the stopping point to the end point of the original sub-track; it then judges at intervals of the same distance whether the actual traveling direction deviates, until the target stain area 2 is reached, after which cleaning is performed according to the cleaning track.
The foregoing description is only of preferred embodiments of the present invention and is not to be construed as limiting its scope; the invention is not limited to the details of the claims and the description, but covers all modifications that fall within its scope.

Claims (10)

1. A ship cleaning method based on machine vision, characterized in that the method comprises the following steps:
A. acquiring an initial image of the part of the ship to be cleaned, and marking a plurality of rectangular stain areas in the initial image;
B. dividing all the stain areas into N area groups according to the abscissa or the ordinate of the center point of each stain area; determining, from the ordinate or the abscissa of the center points, the order in which the robot passes through the stain areas of the i-th area group; after the last stain area of the i-th area group, entering the stain area at the end of the (i+1)-th area group that is closest to it; and repeating in sequence to obtain the travel track of the robot, wherein 1 ≤ i ≤ N-1;
C. dividing each stain area transversely or longitudinally into M equal parts to obtain a plurality of cleaning areas, the width of each cleaning area being smaller than the diameter of the robot's cleaning disc; each bisector and the edges of the rectangular stain area form the robot's cleaning track in that stain area, which starts from one end of an edge parallel to the bisectors, runs to the other end of that edge, then runs horizontally to the adjacent bisector, and so on until it reaches the end of the other edge parallel to the bisectors;
D. controlling the robot to start traveling from the starting point of the travel track, taking that starting point as the robot's first position;
E. after the robot has traveled a certain distance, acquiring its position at that moment as the second position, obtaining the robot's actual traveling direction from the first and second positions, and judging from the actual traveling direction whether the robot has deviated from the travel track; if so, entering step F, otherwise entering step G;
F. stopping the robot, adjusting its posture, and making it travel along a new travel track, the new travel track being the line from the stopping point to the end point of the current travel track; then entering step G;
G. judging whether the target stain area has been reached; if so, cleaning according to the cleaning track, otherwise entering step H;
H. taking the robot's position at this moment as the new first position, and returning to step E.
2. The machine-vision-based ship cleaning method of claim 1, characterized in that obtaining the actual traveling direction of the robot in step E comprises the following steps:
E1. after the robot has traveled a certain distance, shooting a second image containing the robot and inputting it into a trained deep learning model to recognize the robot;
E2. judging whether the confidence of the recognition result of step E1 reaches a preset value; if so, entering step E3, otherwise returning to step E1;
E3. obtaining an ROI region from the recognition result, and performing proportional scaling and Gaussian filtering on the ROI region;
E4. extracting target feature information from the ROI region processed in step E3 to obtain at least one target feature, and taking the target whose center position differs least from the first position as the final target, the center position of the final target being the second position;
E5. obtaining a first vector from the first position to the second position, the direction of the first vector being the actual traveling direction of the robot.
3. The machine-vision-based ship cleaning method of claim 2, characterized in that judging in step E whether the robot deviates from the travel track comprises: taking the direction of the robot's planned travel track as a second vector, calculating the included angle θ between the first vector and the second vector, and judging that the robot has deviated from the travel track when θ is larger than a preset angle threshold; in step F, the posture is adjusted by rotating the robot back by θ.
4. The machine-vision-based ship cleaning method of claim 1, 2 or 3, characterized in that step B specifically comprises:
B1. acquiring the center-point coordinates of each stain area and sorting the center points by abscissa or ordinate value;
B2. dividing the sorted center points into N groups, i.e. dividing the stain areas into N area groups, the abscissas or ordinates of the center points in each area group falling within the same range;
B3. sorting the center points of the stain areas within the i-th area group by ordinate or abscissa value, this ordering being the partial travel track within that area group;
B4. after the robot has passed through the last stain area of the i-th area group along the partial travel track, entering the stain area at the end of the (i+1)-th area group that is closest to it, and repeating in sequence to obtain the travel track of the robot.
5. The machine-vision-based ship cleaning method of claim 1, 2 or 3, characterized in that the initial image includes the robot, and in step C, before the division into M equal parts, the ratio of the robot's actual size to its size in the initial image is calculated in order to determine the width of the cleaning areas and thereby the value of M.
6. The machine-vision-based ship cleaning method of claim 2 or 3, characterized in that in step E1 the distance traveled corresponds to 10-15 frames of images.
7. The machine-vision-based ship cleaning method of claim 1, 2 or 3, characterized in that in step E what is judged is whether the robot's actual traveling direction deviates from the travel track, the travel track here being the sub-track segment the robot is currently on.
8. The machine-vision-based ship cleaning method of claim 1, 2 or 3, characterized in that the ship has a plurality of parts to be cleaned, each corresponding to one robot; each part is cleaned according to steps A to H, and during cleaning the robots of two adjacent parts to be cleaned can communicate with each other.
9. The machine-vision-based ship cleaning method of claim 1, 2 or 3, characterized in that the initial image and the second image are obtained by a pan-tilt camera; the pan-tilt camera is spaced apart from the ship, and the images it shoots can contain the robot.
10. A ship cleaning device based on machine vision, characterized in that it comprises a pan-tilt camera, a control module and the following modules:
a stain region acquisition module: acquiring an initial image of the part of the ship to be cleaned through the pan-tilt camera and feeding it back to the control module, a plurality of rectangular stain areas being marked in the initial image via the control module;
a travel track acquisition module: the control module divides all the stain areas into N area groups according to the abscissa or the ordinate of the center point of each stain area, determines from the ordinate or the abscissa of the center points the order in which the robot passes through the stain areas of the i-th area group, and, after the last stain area of the i-th area group, enters the stain area of the adjacent area group that is closest to it, thereby obtaining the travel track of the robot, wherein 1 ≤ i ≤ N-1;
a cleaning track acquisition module: the control module divides each stain area transversely or longitudinally into M equal parts to obtain a plurality of cleaning areas, the width of each cleaning area being smaller than the diameter of the robot's cleaning disc; each bisector and the edges of the rectangular stain area form the robot's cleaning track in that stain area, which starts from one end of an edge parallel to the bisectors, runs to the other end of that edge, then runs horizontally to the adjacent bisector, and so on until it reaches the end of the other edge parallel to the bisectors;
a cleaning module: the control module controls the robot to start traveling from the starting point of the travel track, taking that starting point as the robot's first position; after the robot has traveled a certain distance it acquires the robot's position at that moment as the second position and obtains the robot's actual traveling direction from the first and second positions; if the actual traveling direction deviates from the travel track, it adjusts the robot's posture so that the robot travels along a new travel track; it then judges at intervals of the same distance whether the actual traveling direction deviates, until the target stain area is reached, after which cleaning is performed according to the cleaning track.
CN202111260358.4A 2021-10-28 2021-10-28 Ship cleaning method and device based on machine vision Active CN114162279B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111260358.4A CN114162279B (en) 2021-10-28 2021-10-28 Ship cleaning method and device based on machine vision

Publications (2)

Publication Number Publication Date
CN114162279A (en) 2022-03-11
CN114162279B (en) 2023-12-19

Family

ID=80477710

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111260358.4A Active CN114162279B (en) 2021-10-28 2021-10-28 Ship cleaning method and device based on machine vision

Country Status (1)

Country Link
CN (1) CN114162279B (en)

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2013013679A1 (en) * 2011-07-27 2013-01-31 Mms Equipment A/S A ship hull cleaning system for removing fouling
CN104176208A (en) * 2014-07-17 2014-12-03 江苏南通申通机械有限公司 Sea creature killing device for ship and sea creature stacking image recognition system based on MATLAB (matrix laboratory)
CN109263828A (en) * 2018-11-06 2019-01-25 高溯 Method and apparatus for being cleaned to navigating ship in ocean
CN209241278U (en) * 2018-11-06 2019-08-13 高溯 Equipment for being cleaned to navigating ship in ocean
CN110182332A (en) * 2019-05-17 2019-08-30 清华大学 Combined-type climbs wall type ship Intelligent Laser rust removalling equipment
CN110525604A (en) * 2019-06-12 2019-12-03 西湖大学 A kind of ship wall region piecemeal cleaning method, device, equipment and storage medium
CN111232150A (en) * 2020-01-16 2020-06-05 中国海洋大学 Hull wall surface cleaning system and cleaning operation method
CN111605676A (en) * 2020-06-12 2020-09-01 中国海洋大学 Ship cleaning robot and cleaning method
CN112407179A (en) * 2020-12-10 2021-02-26 江苏科技大学 Underwater cleaning device for marine equipment and cleaning control method thereof

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8506719B2 (en) * 2009-11-23 2013-08-13 Searobotics Corporation Robotic submersible cleaning system

Also Published As

Publication number Publication date
CN114162279A (en) 2022-03-11


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant