CN114648581A - Method, apparatus, system, medium, and program product for detecting grab bucket of ship unloader - Google Patents

Method, apparatus, system, medium, and program product for detecting grab bucket of ship unloader

Info

Publication number
CN114648581A
Authority
CN
China
Prior art keywords
grab bucket
point cloud
cloud picture
grab
ship unloader
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210326186.4A
Other languages
Chinese (zh)
Inventor
凌杰
肖自立
徐健
叶飞
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Siemens Ltd China
Original Assignee
Siemens Ltd China
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Siemens Ltd China filed Critical Siemens Ltd China
Priority to CN202210326186.4A priority Critical patent/CN114648581A/en
Publication of CN114648581A publication Critical patent/CN114648581A/en
Pending legal-status Critical Current


Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00: Image analysis
    • G06T7/70: Determining position or orientation of objects or cameras
    • G06T7/73: Determining position or orientation of objects or cameras using feature-based methods
    • G06T7/75: Determining position or orientation of objects or cameras using feature-based methods involving models
    • G06T1/00: General purpose image data processing
    • G06T1/0007: Image acquisition
    • G06T19/00: Manipulating 3D models or images for computer graphics
    • G06T19/20: Editing of 3D images, e.g. changing shapes or colours, aligning objects or positioning parts
    • G06T2207/00: Indexing scheme for image analysis or image enhancement
    • G06T2207/10: Image acquisition modality
    • G06T2207/10028: Range image; Depth image; 3D point clouds
    • G06T2207/20: Special algorithmic details
    • G06T2207/20081: Training; Learning
    • G06T2207/20084: Artificial neural networks [ANN]
    • G06T2207/20212: Image combination
    • G06T2207/20224: Image subtraction
    • G06T2219/00: Indexing scheme for manipulating 3D models or images for computer graphics
    • G06T2219/20: Indexing scheme for editing of 3D models
    • G06T2219/2004: Aligning objects, relative positioning of parts

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Graphics (AREA)
  • Architecture (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Computer Hardware Design (AREA)
  • General Engineering & Computer Science (AREA)
  • Software Systems (AREA)
  • Length Measuring Devices By Optical Means (AREA)
  • Image Analysis (AREA)
  • Image Processing (AREA)

Abstract

Embodiments of the invention disclose a method, apparatus, system, medium, and program product for detecting the grab bucket of a ship unloader. The method comprises: acquiring a first point cloud of the material-holding ship hold while the grab bucket is not engaged in a grabbing operation; acquiring a second point cloud of the hold while the grab bucket is engaged in a grabbing operation; determining the difference between the second point cloud and the first point cloud as a third point cloud; and detecting the position of the grab bucket during the grabbing operation based on the third point cloud. Embodiments of the invention can detect the grab bucket's position accurately, require no modification of the grab bucket, and reduce implementation difficulty.

Description

Method, apparatus, system, medium, and program product for detecting grab bucket of ship unloader
Technical Field
Embodiments of the invention relate to the technical field of ship unloaders, and in particular to a method, apparatus, system, medium, and program product for detecting the grab bucket of a ship unloader.
Background
A ship unloader is a special machine built around a continuous conveyor whose head can be raised to the bulk material. Either with its own material-taking capability or in combination with a taking-and-feeding device, it continuously lifts bulk material out of the ship hold, discharges it onto a boom or frame, and conveys it to the shore-side main conveyor system. A ship unloader improves unloading efficiency and reduces dust pollution.
The grab bucket is the final actuator of the ship unloader. During operation, the control system controls the grab bucket only through the tension of the wire ropes, so the grab bucket swings to varying degrees while hoisting and while the trolley travels, and its position changes continuously. In fully automatic and semi-automatic control systems, accurate real-time positioning of the grab bucket is a key control link. Once the real-time position of the grab bucket is known, anti-sway control can be applied to it, or anti-collision control can be performed in combination with other environmental information.
Grab bucket position detection in the prior art mainly takes two forms. Mode (1): a position detection sensor, such as a GPS sensor, is mounted on the grab bucket. Mode (2): a light-emitting source or reflective strip is mounted on the grab bucket and a receiver on the main girder, and the grab bucket's position is detected by identifying the position of the light source or reflective strip.
However, in mode (1) the positioning signal inside the hold is easily blocked, making the grab bucket's position hard to detect accurately; the position sensor also needs an external battery, so the grab bucket must be modified accordingly and the power supply replaced periodically. In mode (2) the light source or reflective strip is easily contaminated or occluded, which degrades the detection result; the grab bucket likewise must be modified for power supply and the battery replaced periodically, so the approach is difficult to implement in practice.
Disclosure of Invention
Embodiments of the present application provide a method, apparatus, system, medium, and program product for detecting the grab bucket of a ship unloader.
In a first aspect, an embodiment of the invention provides a method for detecting the grab bucket of a ship unloader, comprising:
acquiring a first point cloud of the material-holding hold while the grab bucket is not engaged in a grabbing operation;
acquiring a second point cloud of the hold while the grab bucket is engaged in a grabbing operation;
determining the difference between the second point cloud and the first point cloud as a third point cloud;
and detecting the position of the grab bucket during the grabbing operation based on the third point cloud.
Thus, unlike approaches that detect the grab bucket's position with a position detection sensor or a reflective strip, embodiments of the invention detect the grab bucket based on a point cloud mechanism. This overcomes the weakness that positioning signals are easily blocked, detects the grab bucket's position accurately, requires no modification of the grab bucket, and reduces implementation difficulty.
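The core of the first aspect, removing the background (first) point cloud from the live (second) point cloud, can be sketched as a brute-force nearest-neighbour subtraction. This is an illustrative sketch only; the function name, the 5 cm threshold, and the toy coordinates are assumptions, not part of the patent:

```python
import numpy as np

def subtract_background(live_cloud, background_cloud, threshold=0.05):
    """Keep the points of live_cloud that have no background point within
    `threshold` metres (brute-force nearest-neighbour check).

    live_cloud: (N, 3) XYZ points of the scan taken during grabbing.
    background_cloud: (M, 3) XYZ points of the grab-free background scan.
    """
    diffs = live_cloud[:, None, :] - background_cloud[None, :, :]
    dists = np.linalg.norm(diffs, axis=2)       # (N, M) pairwise distances
    nearest = dists.min(axis=1)                 # distance to closest background point
    return live_cloud[nearest > threshold]      # only the "new" points remain

# Toy example: the live scan repeats the background and adds one raised point.
background = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0], [0.0, 1.0, 0.0]])
live = np.vstack([background, [[0.5, 0.5, 2.0]]])
foreground = subtract_background(live, background)
```

The single surviving point plays the role of the third point cloud: everything that is present in the live scan but absent from the background. A production system would use a spatial index (k-d tree or voxel grid) instead of the quadratic distance matrix.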
In one embodiment, acquiring the first point cloud of the material-holding hold while the grab bucket is not engaged in a grabbing operation comprises: scanning the hold with a multi-line lidar mounted on the boom pitching mechanism of the ship unloader while the grab bucket is not engaged in a grabbing operation, to obtain the first point cloud;
and acquiring the second point cloud of the hold while the grab bucket is engaged in a grabbing operation comprises: scanning the hold with the multi-line lidar while the grab bucket is engaged in a grabbing operation, to obtain the second point cloud.
Thus the first and second point clouds can be conveniently obtained from a multi-line lidar mounted on the boom pitching mechanism.
In one embodiment, detecting the position of the grab bucket during the grabbing operation based on the third point cloud comprises:
generating a point cloud model of the grab bucket from a three-dimensional model of the grab bucket;
determining a positioning result for the grab bucket in the third point cloud with a template matching algorithm, based on the point cloud model;
and determining the position of the grab bucket during the grabbing operation based on the positioning result.
Thus a point cloud model is generated from the grab bucket's three-dimensional model and the grab bucket is located in the third point cloud by template matching, so its position during the grabbing operation can be determined accurately.
In one embodiment, detecting the position of the grab bucket during the grabbing operation based on the third point cloud comprises:
scanning the grab bucket from multiple angles with a multi-line lidar mounted on the boom pitching mechanism of the ship unloader to generate a point cloud model of the grab bucket;
determining a positioning result for the grab bucket in the third point cloud with a template matching algorithm, based on the point cloud model;
and determining the position of the grab bucket during the grabbing operation based on the positioning result.
Thus a point cloud model is generated by multi-line lidar scanning and the grab bucket is located in the third point cloud by template matching, so its position during the grabbing operation can be determined accurately.
In one embodiment, detecting the position of the grab bucket during the grabbing operation based on the third point cloud comprises:
feeding labelled grab bucket point clouds into an artificial neural network as training data, so as to train it into a grab bucket positioning model suited to locating the grab bucket in a point cloud;
and feeding the third point cloud into the grab bucket positioning model to determine the grab bucket's position during the grabbing operation from its positioning result in the third point cloud.
Thus a grab bucket positioning model trained by deep learning accurately locates the grab bucket's position during the grabbing operation.
In one embodiment, detecting the position of the grab bucket during the grabbing operation based on the third point cloud comprises:
projecting the third point cloud into a two-dimensional or three-dimensional image;
locating the grab bucket in the two-dimensional or three-dimensional image by target image recognition;
and determining the grab bucket's position during the grabbing operation from its positioning result in the image.
Thus, by locating the grab bucket via target image recognition in the image projected from the third point cloud, embodiments of the invention can accurately determine the grab bucket's position during the grabbing operation.
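The projection of the third point cloud into a two-dimensional image can be sketched as a top-down height map; the grid size, image shape, and function name below are illustrative assumptions, not taken from the patent:

```python
import numpy as np

def project_to_depth_image(points, grid_size=0.1, shape=(8, 8)):
    """Project an (N, 3) XYZ cloud onto a top-down height image.

    Each cell stores the maximum z of the points falling into it;
    empty cells are left at 0. Cell indices are clipped to the image.
    """
    img = np.full(shape, -np.inf)
    ix = np.clip((points[:, 0] / grid_size).astype(int), 0, shape[0] - 1)
    iy = np.clip((points[:, 1] / grid_size).astype(int), 0, shape[1] - 1)
    np.maximum.at(img, (ix, iy), points[:, 2])   # scatter-max per cell
    img[np.isinf(img)] = 0.0
    return img

# Two points share a cell (the maximum height wins); one lands in another cell.
cloud = np.array([[0.05, 0.05, 1.5], [0.05, 0.05, 0.5], [0.35, 0.05, 0.2]])
img = project_to_depth_image(cloud)
```

The resulting image could then be fed to any 2D target-recognition method to locate the grab bucket.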
In a second aspect, an embodiment of the invention provides a detection apparatus for the grab bucket of a ship unloader, comprising:
a first acquisition module for acquiring a first point cloud of the material-holding hold while the grab bucket is not engaged in a grabbing operation;
a second acquisition module for acquiring a second point cloud of the hold while the grab bucket is engaged in a grabbing operation;
a determining module for determining the difference between the second point cloud and the first point cloud as a third point cloud;
and a detection module for detecting the position of the grab bucket during the grabbing operation based on the third point cloud.
Thus, unlike approaches that detect the grab bucket's position with a position detection sensor or a reflective strip, embodiments of the invention detect the grab bucket based on a point cloud mechanism, overcoming the weakness that positioning signals are easily blocked, detecting the grab bucket's position accurately, requiring no modification of the grab bucket, and reducing implementation difficulty.
In one embodiment, the first acquisition module is configured to scan the hold with a multi-line lidar mounted on the boom pitching mechanism of the ship unloader while the grab bucket is not engaged in a grabbing operation, to acquire the first point cloud;
and the second acquisition module is configured to scan the hold with the multi-line lidar while the grab bucket is engaged in a grabbing operation, to acquire the second point cloud.
Thus the first and second point clouds can be conveniently obtained from a multi-line lidar mounted on the boom pitching mechanism.
In one embodiment, the detection module is configured to perform at least one of the following:
generating a point cloud model of the grab bucket from a three-dimensional model of the grab bucket; determining a positioning result for the grab bucket in the third point cloud with a template matching algorithm, based on the point cloud model; and determining the position of the grab bucket during the grabbing operation based on the positioning result;
scanning the grab bucket from multiple angles with a multi-line lidar mounted on the boom pitching mechanism of the ship unloader to generate a point cloud model of the grab bucket; determining a positioning result for the grab bucket in the third point cloud with a template matching algorithm, based on the point cloud model; and determining the position of the grab bucket during the grabbing operation based on the positioning result;
feeding labelled grab bucket point clouds into an artificial neural network as training data, so as to train it into a grab bucket positioning model suited to locating the grab bucket in a point cloud; and feeding the third point cloud into the grab bucket positioning model to determine the grab bucket's position during the grabbing operation from its positioning result in the third point cloud;
projecting the third point cloud into a two-dimensional or three-dimensional image; locating the grab bucket in the image by target image recognition; and determining the grab bucket's position during the grabbing operation from its positioning result in the image.
Thus embodiments of the invention achieve accurate positioning of the grab bucket in a variety of ways.
In a third aspect, an embodiment of the invention provides a detection system for the grab bucket of a ship unloader, comprising:
a multi-line lidar, mounted on the boom pitching mechanism of the ship unloader, for scanning the material-holding hold while the grab bucket is not engaged in a grabbing operation to obtain a first point cloud, and scanning the hold while the grab bucket is engaged in a grabbing operation to obtain a second point cloud;
and a control module, arranged in the cart of the ship unloader, for determining the difference between the second point cloud and the first point cloud as a third point cloud, and detecting the position of the grab bucket during the grabbing operation based on the third point cloud.
Thus, unlike approaches that detect the grab bucket's position with a position detection sensor or a reflective strip, embodiments of the invention detect the grab bucket based on a point cloud mechanism, overcoming the weakness that positioning signals are easily blocked, detecting the grab bucket's position accurately, requiring no modification of the grab bucket, and reducing implementation difficulty.
In one embodiment, the multi-line lidar is fixedly mounted outside the travel range of the trolley of the ship unloader, with its field of view directed toward the hold.
Thus, fixing the multi-line lidar outside the travel range of the ship unloader's trolley avoids any interference with the trolley's movement.
In a fourth aspect, an embodiment of the invention provides a detection apparatus for the grab bucket of a ship unloader, comprising:
at least one memory configured to store computer-readable code;
at least one processor configured to invoke the computer-readable code to perform the steps of the method for detecting the grab bucket of a ship unloader as recited in any one of the above.
In a fifth aspect, an embodiment of the invention provides a computer-readable medium having computer-readable instructions stored thereon which, when executed by a processor, cause the processor to perform the steps of the method for detecting the grab bucket of a ship unloader as recited in any one of the above.
In a sixth aspect, an embodiment of the invention provides a computer program product, tangibly stored on a computer-readable medium and comprising computer-readable instructions which, when executed, cause at least one processor to perform the steps of the method for detecting the grab bucket of a ship unloader as recited in any one of the above.
Drawings
Fig. 1 is an exemplary flowchart of a method for detecting a grab bucket of a ship unloader according to an embodiment of the present invention.
FIG. 2 is a schematic diagram of a background point cloud according to an embodiment of the present invention.
FIG. 3 is a schematic diagram of a real-time point cloud according to an embodiment of the invention.
FIG. 4 is a schematic diagram of subtracting a background point cloud from a real-time point cloud according to an embodiment of the invention.
Fig. 5 is a schematic diagram of a grab detection according to an embodiment of the present invention.
Fig. 6 is a schematic view of a process of detecting a grab bucket of a ship unloader according to an embodiment of the present invention.
Fig. 7 is a structural view of a detection device for a grab bucket of a ship unloader according to an embodiment of the present invention.
Fig. 8 is a block diagram of a detection device of a grab bucket of a ship unloader having a memory-processor architecture according to an embodiment of the present invention.
Wherein the reference numbers include: 50, background point cloud; 60, grab bucket point cloud; 61, disturbance point cloud (the full reference numeral table is rendered only as images in the original).
Detailed Description
The subject matter described herein will now be discussed with reference to example embodiments. It should be understood that these embodiments are discussed only to enable those skilled in the art to better understand and thereby implement the subject matter described herein, and are not intended to limit the scope, applicability, or examples set forth in the claims. Changes may be made in the function and arrangement of elements discussed without departing from the scope of the embodiments of the disclosure. Various examples may omit, substitute, or add various procedures or components as needed. For example, the described methods may be performed in an order different from that described, and various steps may be added, omitted, or combined. In addition, features described with respect to some examples may also be combined in other examples.
As used herein, the term "include" and its variants are open-ended, meaning "including, but not limited to". The term "based on" means "based at least in part on". The terms "one embodiment" and "an embodiment" mean "at least one embodiment"; the term "another embodiment" means "at least one other embodiment". The terms "first", "second", and the like may refer to different or the same objects. Other definitions, whether explicit or implicit, may be included below. The definition of a term is consistent throughout the specification unless the context clearly dictates otherwise.
In view of the drawbacks of prior-art approaches that detect the grab bucket's position with a position detection sensor or a reflective strip, embodiments of the invention provide a technical scheme that detects the grab bucket based on a point cloud mechanism, so the grab bucket's position can be detected accurately without modifying the grab bucket, reducing implementation difficulty.
Fig. 1 is an exemplary flowchart of a method for detecting a grab bucket of a ship unloader according to an embodiment of the present invention.
As shown in fig. 1, the method 100 includes:
step 101: and acquiring a first point cloud picture of the cabin containing the materials when the grab bucket is not in the process of grabbing the materials.
A point cloud refers to a collection of a large number of points of a target surface property. In step 101, a first cloud picture of a cabin containing materials, in which the grab bucket is not in the process of grabbing materials, is obtained according to a laser measurement principle, a photogrammetry principle or a combination of the laser measurement principle and the photogrammetry principle. Wherein: point cloud obtained according to the laser measurement principle comprises three-dimensional coordinates of a background environment including a cabin and laser reflection Intensity (Intensity); point clouds obtained according to photogrammetry principles, including background environment and color information (RGB) including a cabin; the point cloud obtained according to the combination mode of the laser measurement principle and the photogrammetry principle comprises three-dimensional coordinates of a background environment including a cabin, laser reflection intensity and color information. Preferably, the format of the first point cloud picture may include: *. pts; *. asc; *. dat; *. stl; *. imw; *. xyz; and so on.
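As a minimal illustration of one of the listed formats, a *.xyz-style file with an intensity column can be parsed into an array. The exact column layout varies between tools, so the four-column assumption here is illustrative:

```python
import numpy as np
from io import StringIO

def load_xyz(text):
    """Parse a whitespace-separated *.xyz-style point cloud into an (N, 4)
    array of x, y, z, intensity. Real files vary: some carry RGB columns
    instead of (or besides) intensity, so this layout is an assumption."""
    return np.loadtxt(StringIO(text)).reshape(-1, 4)

# Two points: x y z intensity per line.
sample = "1.0 2.0 0.5 200\n1.1 2.0 0.6 180\n"
cloud = load_xyz(sample)
```

In practice the text would come from a file handle rather than an in-memory string.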
Here, the first point cloud contains the hold holding the material. Because the grab bucket is not engaged in a grabbing operation while the first point cloud is generated, the first point cloud does not include the grab bucket. During a grabbing operation the cargo ship is usually stationary, so the first point cloud can be regarded as the background of the grabbing operation.
Step 102: acquire a second point cloud of the hold while the grab bucket is engaged in a grabbing operation.
Similarly, in step 102 the second point cloud of the hold, captured while the grab bucket is engaged in a grabbing operation, may be obtained by the laser measurement principle, the photogrammetry principle, or a combination of the two. Accordingly, the format of the second point cloud may include *.pts, *.asc, *.dat, *.stl, *.imw, *.xyz, and so on.
Here, the second point cloud contains the hold holding the material. Because the grab bucket is grabbing material in the hold while the second point cloud is generated, the second point cloud also includes the grab bucket.
Step 103: determine the difference between the second point cloud and the first point cloud as a third point cloud.
The difference between the second and first point clouds is the third point cloud, obtained by removing the first point cloud, as background, from the second point cloud. The third point cloud contains the grab bucket and possibly disturbance data; for example, disturbance data may be generated when sea swell moves the hold.
Step 104: detect the position of the grab bucket during the grabbing operation based on the third point cloud.
The grab bucket's position may change during the grabbing operation. Preferably, the second point cloud of the hold is acquired in real time in step 102; the second point clouds are then a series indexed by sampling time, i.e. there are multiple second point clouds. Correspondingly there are multiple third point clouds in step 103, so in step 104 the grab bucket's successive positions during the grabbing operation can be detected in real time from the multiple third point clouds.
In one embodiment, acquiring the first point cloud in step 101 comprises scanning the hold with a multi-line lidar mounted on the boom pitching mechanism of the ship unloader while the grab bucket is not engaged in a grabbing operation, to obtain the first point cloud; and acquiring the second point cloud in step 102 comprises scanning the hold with the multi-line lidar while the grab bucket is engaged in a grabbing operation, to obtain the second point cloud.
The multi-line lidar comprises: (1) a laser emitting array for emitting multiple laser beams; (2) a laser receiving array for receiving the multiple laser echoes reflected by the target object; (3) an echo sampling device for sampling the multiple laser echoes by time-division multiplexing and outputting a sampled data stream; and (4) a control system, connected to the laser emitting array, the laser receiving array, and the echo sampling device, for controlling the operation of the emitting and receiving arrays and determining measurement data from the sampled data stream. The multi-line lidar may further include (5) an output device for outputting the measurement data. Because the echo sampling device samples by time-division multiplexing and the control system then processes the stream in real time, the real-time performance of the measurement process is improved.
The multi-line lidar may be a 4-line, 8-line, 16-line, 32-line, 64-line, or 128-line lidar, among others. A multi-line lidar can recover the height of objects and acquire a three-dimensional scan of the surrounding environment. Taking a 4-line lidar as an example: the lidar polls its four laser transmitters, and one polling cycle yields one frame of laser point cloud data; together, the four channels of point data form surface information from which the height of the hold can be obtained.
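The way a 4-line polling frame yields three-dimensional points can be sketched by converting each channel's measured range and scan angles into Cartesian coordinates. The angle conventions below are a common choice, not taken from the patent:

```python
import numpy as np

def channels_to_points(ranges, azimuths_deg, elevations_deg):
    """Convert one polling frame of a multi-line lidar into XYZ points.

    ranges:         (channels, steps) measured distances in metres
    azimuths_deg:   (steps,) horizontal scan angle at each polling step
    elevations_deg: (channels,) fixed vertical angle of each laser line
    """
    az = np.deg2rad(azimuths_deg)[None, :]    # (1, steps)
    el = np.deg2rad(elevations_deg)[:, None]  # (channels, 1)
    x = ranges * np.cos(el) * np.cos(az)
    y = ranges * np.cos(el) * np.sin(az)
    z = ranges * np.sin(el)                   # the per-line height information
    return np.stack([x, y, z], axis=-1).reshape(-1, 3)

# One frame of a 4-line lidar, two azimuth steps, all ranges 1 m.
frame = channels_to_points(np.ones((4, 2)), np.array([0.0, 90.0]), np.zeros(4))
```

Each additional laser line adds one fixed elevation angle, which is why more lines recover richer height information from a single sweep.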
The embodiments of the present invention are described above by taking specific arrangement positions and specific line numbers of the multiline lidar as examples, and those skilled in the art can appreciate that such descriptions are only exemplary and are not intended to limit the scope of the embodiments of the present invention.
FIG. 2 is a schematic diagram of a background point cloud according to an embodiment of the present invention. In fig. 2, the multi-line lidar scans the hold while the grab bucket is not engaged in a grabbing operation, producing the background cloud 50 (i.e., the first point cloud). As shown, the background cloud 50 does not include the grab bucket.
FIG. 3 is a schematic diagram of a real-time point cloud according to an embodiment of the invention. The multi-line lidar scans the hold in real time while the grab bucket is engaged in a grabbing operation, producing the second point cloud, which comprises the background cloud 50 and the grab bucket cloud 60. The background cloud 50 in fig. 3 may differ from that in fig. 2 because of possible disturbances (such as sea swell moving the hold).
FIG. 4 is a schematic diagram of subtracting the background point cloud from the real-time point cloud according to an embodiment of the invention. Subtracting the background point cloud from the real-time point cloud yields the third point cloud, which, owing to the disturbance, contains both the grab bucket cloud 60 and a disturbance cloud 61.
Various detection procedures can be applied to this third point cloud, comprising the grab bucket cloud 60 and the disturbance cloud 61, to determine the position of the grab bucket during the grabbing operation.
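One simple way to separate the grab bucket cloud 60 from the scattered disturbance cloud 61, assuming the grab bucket forms the largest connected blob in the difference cloud, is greedy single-linkage clustering. This is an illustrative sketch, not the patent's stated method, and the linkage distance is an assumption:

```python
import numpy as np

def largest_cluster(points, link_dist=0.3):
    """Greedy single-linkage clustering; return the biggest cluster.

    Brute force (O(n^2)), fine for illustration but not for large clouds.
    """
    labels = -np.ones(len(points), dtype=int)
    current = 0
    for i in range(len(points)):
        if labels[i] != -1:
            continue
        labels[i] = current
        frontier = [i]
        while frontier:                       # flood-fill one cluster
            j = frontier.pop()
            d = np.linalg.norm(points - points[j], axis=1)
            for k in np.where((d < link_dist) & (labels == -1))[0]:
                labels[k] = current
                frontier.append(k)
        current += 1
    counts = np.bincount(labels)
    return points[labels == counts.argmax()]

# Dense blob (the grab bucket) plus two isolated disturbance points.
main = np.array([[0.0, 0.0, 0.0], [0.1, 0.0, 0.0], [0.2, 0.0, 0.0],
                 [0.1, 0.1, 0.0], [0.0, 0.1, 0.0]])
noise = np.array([[5.0, 5.0, 5.0], [10.0, 0.0, 0.0]])
bucket = largest_cluster(np.vstack([main, noise]), link_dist=0.3)
```

The surviving cluster would then be passed to whichever positioning method (template matching, neural network, or image recognition) the embodiment uses.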
In one embodiment, detecting the position of the grab bucket during the grabbing operation in step 104 comprises: generating a point cloud model of the grab bucket from a three-dimensional model of the grab bucket; determining a positioning result for the grab bucket in the third point cloud with a template matching algorithm, based on the point cloud model; and determining the position of the grab bucket during the grabbing operation based on the positioning result.
For example, a three-dimensional model of the grab bucket can be established using three-dimensional drawing software (the two specific packages appear only as images, BDA0003573530290000081 and BDA0003573530290000082, in the original). The three-dimensional model is then converted into the point cloud model of the grab bucket. Next, the template matching algorithm determines the positioning result of the grab bucket in the third point cloud picture, and the real position of the grab bucket during the material grabbing operation is determined from that positioning result. Because the multi-line lidar scans in real time, the second point cloud picture may be a series of point cloud pictures varying with sampling time during the material grabbing operation; that is, there are multiple second point cloud pictures. Correspondingly, there are multiple third point cloud pictures, so each real-time position of the grab bucket during the material grabbing operation can be detected based on the multiple third point cloud pictures.
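Converting the three-dimensional model into a point cloud model amounts to sampling points on the model's surface. A minimal sketch, assuming the model is given as a triangle mesh (vertex array plus face indices; the function and parameter names are illustrative, not from the patent):

```python
import numpy as np

def sample_mesh(vertices: np.ndarray, faces: np.ndarray,
                n_points: int, rng=None) -> np.ndarray:
    """Uniformly sample a point cloud from a triangle mesh: pick faces with
    probability proportional to their area, then draw uniform barycentric
    points inside each chosen face."""
    rng = np.random.default_rng(rng)
    v = vertices[faces]                                   # (F, 3, 3)
    # triangle areas via the cross product of two edge vectors
    areas = 0.5 * np.linalg.norm(
        np.cross(v[:, 1] - v[:, 0], v[:, 2] - v[:, 0]), axis=1)
    idx = rng.choice(len(faces), size=n_points, p=areas / areas.sum())
    # uniform barycentric coordinates (sqrt trick avoids corner clustering)
    r1, r2 = rng.random(n_points), rng.random(n_points)
    s = np.sqrt(r1)
    a, b, c = 1.0 - s, s * (1.0 - r2), s * r2             # a + b + c = 1
    t = v[idx]
    return a[:, None] * t[:, 0] + b[:, None] * t[:, 1] + c[:, None] * t[:, 2]
```

The resulting point set serves as the template for the matching step; in practice the sampling density would be chosen to match the lidar's point density on the grab bucket.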
The specific process of template matching is explained below. Template matching is an important component of digital image processing: it spatially aligns two or more images of the same scene acquired by different sensors, or by the same sensor at different times or under different imaging conditions, or searches one image for a region corresponding to a known pattern. Simply put, the template is a known small image (here, the point cloud model), and template matching searches for that target in a larger image (here, the third point cloud picture). If the large image is known to contain the target with the same size, orientation and appearance as the template, an algorithm can find the target and determine its coordinate position. Therefore, once the position of the point cloud model in the third point cloud picture has been determined by template matching, the position of the grab bucket in the real world can be obtained through the coordinate transformation between the coordinate system of the third point cloud picture and the coordinate system of the real world.
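As an illustration of the matching idea, the sketch below performs translation-only template matching over a set of candidate offsets by minimizing a one-sided chamfer distance. This is a deliberately simplified stand-in — a real system would also search over rotation and refine the coarse result with ICP — and all function names are assumptions:

```python
import numpy as np

def chamfer(a: np.ndarray, b: np.ndarray) -> float:
    """One-sided chamfer distance: mean distance from each point of `a`
    to its nearest neighbour in `b` (lower means better alignment)."""
    d = np.linalg.norm(a[:, None, :] - b[None, :, :], axis=2)
    return float(d.min(axis=1).mean())

def match_template(template: np.ndarray, scene: np.ndarray,
                   offsets: np.ndarray) -> np.ndarray:
    """Try each candidate translation of the template against the scene
    point cloud and return the offset with the lowest chamfer distance."""
    scores = [chamfer(template + t, scene) for t in offsets]
    return offsets[int(np.argmin(scores))]
```

The winning offset is the positioning result in the coordinate system of the third point cloud picture; a subsequent rigid transformation maps it into real-world coordinates.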
Therefore, by generating the point cloud model from the three-dimensional model of the grab bucket and locating the grab bucket in the third point cloud picture with the template matching algorithm, the position of the grab bucket during the material grabbing operation can be determined accurately.
In one embodiment, the step 104 of detecting the position of the grab bucket during the material grabbing operation based on the third point cloud picture comprises: scanning the grab bucket from multiple angles with a multi-line lidar arranged on the boom pitching mechanism of the ship unloader to generate a point cloud model of the grab bucket; determining a positioning result of the grab bucket in the third point cloud picture according to a template matching algorithm based on the point cloud model; and determining the position of the grab bucket during the material grabbing operation based on the positioning result. The multi-line lidar can also be arranged at any other position suitable for scanning the grab bucket from multiple angles; the embodiments of the present invention are not limited in this respect.
Therefore, by generating the point cloud model from the multi-line lidar scan and locating the grab bucket in the third point cloud picture with the template matching algorithm, the position of the grab bucket during the material grabbing operation can be determined accurately.
In one embodiment, the step 104 of detecting the position of the grab bucket during the material grabbing operation based on the third point cloud picture comprises: inputting labeled grab bucket point cloud pictures as training data into an artificial neural network, so as to train the artificial neural network into a grab bucket positioning model adapted to locate the grab bucket in a point cloud picture; and inputting the third point cloud picture into the grab bucket positioning model, so as to determine the position of the grab bucket during the material grabbing operation based on the positioning result of the grab bucket in the third point cloud picture.
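The patent does not specify the network architecture or training procedure. As a toy stand-in, the sketch below trains a logistic-regression point classifier (label 1 = grab-bucket point) by gradient descent and reports the centroid of positively classified points as the grab bucket position; a production system would instead use a 3-D deep network (e.g. a PointNet-style model) trained on the labeled point cloud pictures:

```python
import numpy as np

def sigmoid(z: np.ndarray) -> np.ndarray:
    return 1.0 / (1.0 + np.exp(-np.clip(z, -30.0, 30.0)))  # clip avoids overflow

def train_point_classifier(points: np.ndarray, labels: np.ndarray,
                           epochs: int = 500, lr: float = 0.5):
    """Gradient descent on the logistic loss over raw xyz coordinates —
    a toy substitute for the patent's (unspecified) neural network."""
    w, b = np.zeros(3), 0.0
    for _ in range(epochs):
        g = sigmoid(points @ w + b) - labels       # dLoss/dz per point
        w -= lr * points.T @ g / len(points)
        b -= lr * g.mean()
    return w, b

def locate_grab_bucket(points: np.ndarray, w: np.ndarray, b: float) -> np.ndarray:
    """Position estimate: centroid of the points classified as grab bucket."""
    mask = sigmoid(points @ w + b) > 0.5
    return points[mask].mean(axis=0)
```

The design point illustrated here is the two-stage split the embodiment describes: an offline-trained positioning model, then a per-frame inference over each third point cloud picture.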
Therefore, by training the grab bucket positioning model through deep learning, the position of the grab bucket during the material grabbing operation can be determined accurately.
In one embodiment, the step 104 of detecting the position of the grab bucket during the material grabbing operation based on the third point cloud picture comprises: projecting the third point cloud picture into a two-dimensional image or a three-dimensional image; locating the grab bucket in the two-dimensional image or the three-dimensional image by means of target image recognition; and determining the position of the grab bucket during the material grabbing operation based on the positioning result of the grab bucket in the two-dimensional image or the three-dimensional image.
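The two-dimensional projection can be sketched as a top-down occupancy grid; the resolution, image size, and function name below are illustrative assumptions, not values from the patent:

```python
import numpy as np

def project_to_image(points: np.ndarray, resolution: float = 0.5,
                     size: int = 64) -> np.ndarray:
    """Project a 3-D point cloud onto the x-y plane as a binary occupancy
    image, so that standard 2-D detectors can locate the grab bucket; an
    occupied pixel maps back to world coordinates via `resolution`."""
    image = np.zeros((size, size), dtype=np.uint8)
    cells = np.floor(points[:, :2] / resolution).astype(int)  # x,y -> col,row
    keep = ((cells >= 0) & (cells < size)).all(axis=1)        # drop out-of-bounds
    image[cells[keep, 1], cells[keep, 0]] = 1                 # row = y, col = x
    return image
```

Once the grab bucket is boxed in this image, the pixel coordinates of the box, scaled by `resolution`, give its x-y position in the point cloud frame.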
Therefore, by locating the grab bucket through target image recognition in the two-dimensional or three-dimensional image projected from the third point cloud picture, the position of the grab bucket during the material grabbing operation can be determined accurately.
Fig. 5 is a schematic diagram of grab bucket detection according to an embodiment of the present invention. It can be seen that, after the template matching algorithm is applied, the grab bucket point cloud 60 is located in the third point cloud picture.
After the real-time position of the grab bucket is obtained, anti-sway control or anti-collision control of the grab bucket can be performed in combination with other environment information.
Therefore, the embodiments introduce lidar, which is common in robotics and autonomous driving, into the crane field: the position of the grab bucket can be detected in real time and then used for anti-sway or anti-collision control of the grab bucket. Compared with the prior art, the structure is simple and not easily interfered with. In addition, the multi-line lidar can simultaneously serve other detection tasks.
The embodiments of the present invention also provide a grab bucket system of a ship unloader. The system comprises: a multi-line lidar, arranged on the boom pitching mechanism of the ship unloader, for scanning the cabin containing the material while the grab bucket is not in the process of the material grabbing operation to obtain the first point cloud picture, and scanning the cabin while the grab bucket is in the process of the material grabbing operation to obtain the second point cloud picture; and a control module, arranged in the cart of the ship unloader, for determining the difference between the second point cloud picture and the first point cloud picture as the third point cloud picture, and detecting the position of the grab bucket during the material grabbing operation based on the third point cloud picture.
In one embodiment, the multi-line lidar is fixedly disposed outside the range of motion of the trolley of the ship unloader, with its field of view directed toward the cabin.
Fig. 6 is a schematic view of a process of detecting a grab bucket of a ship unloader according to an embodiment of the present invention.
In fig. 6, the ship unloader 10 performs a ship unloading operation on a cargo ship 22 moored at the quay. The material 21 in the cargo ship 22 is stored in the storage space of the cargo hold 20. The ship unloader 10 includes a ship unloader grab bucket 11, a trolley 13, a boom pitching mechanism 14, a cab 17, and a cart 18. The cart 18 contains an electrical room 15 and an unloading frame 16. The grab bucket 11 of the ship unloader is connected to the trolley 13 via a wire rope 12. Both the trolley 13 and the cab 17 can slide along the boom pitching mechanism 14. The multi-line lidar 30 is fixedly arranged on the boom pitching mechanism 14, outside the range of movement of the trolley 13, with its field of view directed toward the cargo hold 20.
When the grab bucket 11 is not in the process of the material grabbing operation (for example, when the grab bucket 11 is located above the unloading frame 16), the multi-line lidar 30 collects the first point cloud picture of the cargo hold 20 as the background and transmits it to the control module 40 in the electrical room 15 via wired or wireless communication.
While the grab bucket 11 is in the process of the material grabbing operation (for example, while the grab bucket 11 is moving inside the cargo hold 20), the multi-line lidar 30 acquires second point cloud pictures of the cargo hold 20 in real time (correspondingly, the second point cloud picture is a series of point cloud pictures associated with the acquisition time) and transmits them to the control module 40 in the electrical room 15 via wired or wireless communication. The control module 40 determines the difference between each second point cloud picture and the first point cloud picture as a third point cloud picture (correspondingly, there are multiple third point cloud pictures).
The control module 40 can detect the real-time position of the grab bucket 11 during the material grabbing operation based on the third point cloud picture. Specific approaches include:
(1) The control module 40 generates a point cloud model of the grab bucket 11 based on the three-dimensional model of the grab bucket 11, determines the positioning result of the grab bucket 11 in the third point cloud picture according to the template matching algorithm based on the point cloud model, and determines the position of the grab bucket 11 during the material grabbing operation based on the positioning result.
(2) The multi-line lidar 30 scans the grapple 11 from multiple angles to generate a point cloud model of the grapple 11. The control module 40 determines the positioning result of the grab bucket 11 in the third point cloud image according to the template matching algorithm based on the point cloud model. The control module 40 determines the position of the grapple 11 during the material grabbing operation based on the positioning result.
(3) The control module 40 inputs labeled grab bucket point cloud pictures as training data into an artificial neural network in advance, to train the artificial neural network into a grab bucket positioning model adapted to locate the grab bucket 11 in a point cloud picture. The control module 40 then inputs the third point cloud picture into the grab bucket positioning model to determine the position of the grab bucket 11 during the material grabbing operation based on the positioning result of the grab bucket 11 in the third point cloud picture.
(4) The control module 40 projects the third point cloud picture into a two-dimensional image or a three-dimensional image, locates the grab bucket 11 in that image by means of target image recognition, and determines the position of the grab bucket 11 during the material grabbing operation based on the positioning result of the grab bucket 11 in the two-dimensional image or the three-dimensional image.
Fig. 7 is a structural view of a detection device for a grab bucket of a ship unloader according to an embodiment of the present invention.
The detection device 700 of the grab bucket of the ship unloader includes:
the first acquisition module 701 is used for acquiring a first point cloud picture of a cabin containing materials when the grab bucket is not in the process of material grabbing operation;
a second obtaining module 702, configured to obtain a second point cloud chart of the cabin when the grab bucket is in the process of material grabbing operation;
a determining module 703, configured to determine a difference between the second point cloud image and the first point cloud image as a third point cloud image;
and the detection module 704 is used for detecting the position of the grab bucket during the material grabbing operation based on the third point cloud picture.
In one embodiment, the first acquisition module 701 is configured to scan, with a multi-line lidar arranged on the boom pitching mechanism of the ship unloader, the cabin while the grab bucket is not in the process of the material grabbing operation, so as to acquire the first point cloud picture; and the second acquisition module 702 is configured to scan the cabin with the multi-line lidar while the grab bucket is in the process of the material grabbing operation, so as to acquire the second point cloud picture.
In one embodiment, the detecting module 704 is configured to perform at least one of the following: generating a point cloud model of the grab bucket based on the three-dimensional model of the grab bucket; determining a positioning result of the grab bucket in a third point cloud picture according to a template matching algorithm based on the point cloud model; determining the position of the grab bucket in the material grabbing process based on the positioning result; scanning the grab bucket in a multi-angle mode by using a multi-line laser radar arranged on an arm frame pitching mechanism of the ship unloader to generate a point cloud model of the grab bucket; determining a positioning result of the grab bucket in a third point cloud picture according to a template matching algorithm based on the point cloud model; determining the position of the grab bucket in the material grabbing process based on the positioning result; inputting the marked grab bucket point cloud picture serving as training data into an artificial neural network so as to train the artificial neural network into a grab bucket positioning model which is suitable for positioning a grab bucket from the point cloud picture; inputting the third point cloud picture into a grab bucket positioning model so as to determine the position of the grab bucket in the grabbing operation process based on the positioning result of the grab bucket in the third point cloud picture; projecting the third point cloud picture into a two-dimensional image or a three-dimensional image; positioning the grab bucket from the two-dimensional image or the three-dimensional image in a target image identification mode; and determining the position of the grab bucket in the material grabbing process based on the positioning result of the grab bucket in the two-dimensional image or the three-dimensional image.
Fig. 8 is a block diagram of a detection device of a grab bucket of a ship unloader having a memory-processor architecture according to an embodiment of the present invention. As shown in fig. 8, the detecting device 800 of the grab bucket of the ship unloader comprises a processor 802, a memory 801 and a computer program stored on the memory 801 and operable on the processor 802, wherein when the computer program is executed by the processor 802, the detecting method of the grab bucket of the ship unloader as described above is realized. The memory 801 may be embodied as various storage media such as an Electrically Erasable Programmable Read Only Memory (EEPROM), a Flash memory (Flash memory), and a Programmable Read Only Memory (PROM). The processor 802 may be implemented to include one or more central processors or one or more field programmable gate arrays integrated with one or more central processor cores. In particular, the central processor or central processor core may be implemented as a CPU or MCU or DSP, etc.
It should be noted that not all steps and modules in the above flows and structures are necessary, and some steps or modules may be omitted according to actual needs. The execution order of the steps is not fixed and can be adjusted as required. The division of each module is only for convenience of describing adopted functional division, and in actual implementation, one module may be divided into multiple modules, and the functions of multiple modules may also be implemented by the same module, and these modules may be located in the same device or in different devices. The hardware modules in the various embodiments may be implemented mechanically or electronically. For example, a hardware module may include a specially designed permanent circuit or logic device (e.g., a special purpose processor such as an FPGA or ASIC) for performing specific operations. A hardware module may also include programmable logic devices or circuits (e.g., including a general-purpose processor or other programmable processor) that are temporarily configured by software to perform certain operations. The implementation of the hardware module in a mechanical manner, or in a dedicated permanent circuit, or in a temporarily configured circuit (e.g., configured by software) may be determined by cost and time considerations.
Furthermore, the embodiment of the invention also provides a computer readable storage medium, which stores computer readable codes, and when the computer readable codes are executed by a processor, the processor is caused to execute the detection method of the grab bucket of the ship unloader. Additionally, embodiments of the present invention also provide a computer program product, tangibly stored on a computer-readable medium and comprising computer-readable instructions that, when executed, cause at least one processor to perform the method of detecting a grab bucket of a ship unloader of embodiments of the present invention.
The system structure described in the above embodiments may be a physical structure or a logical structure; that is, some modules may be implemented by the same physical entity, some modules may be implemented by multiple physical entities, and some modules may be implemented jointly by components of multiple independent devices.
The above description is only for the purpose of illustrating the preferred embodiments of the present invention and should not be taken as limiting the scope of the present invention, which is intended to cover any modifications, equivalents, improvements, etc. within the spirit and scope of the present invention.

Claims (14)

1. Method (100) for detecting a grab bucket of a ship unloader, characterized in that it comprises:
acquiring a first point cloud picture (101) of a cabin containing materials when a grab bucket is not in the process of material grabbing operation;
acquiring a second point cloud picture (102) of the cabin while the grab bucket is in the process of the material grabbing operation;
determining the difference between the second point cloud picture and the first point cloud picture as a third point cloud picture (103);
detecting a position of the grab bucket during the grab operation based on the third point cloud picture (104).
2. The method (100) of inspection of a grab bucket of a ship unloader according to claim 1,
the acquiring of the first point cloud picture (101) of the cabin containing the material while the grab bucket is not in the process of the material grabbing operation comprises: scanning the cabin with a multi-line lidar arranged on a boom pitching mechanism of the ship unloader while the grab bucket is not in the process of the material grabbing operation, so as to obtain the first point cloud picture;
and the acquiring of the second point cloud picture (102) of the cabin while the grab bucket is in the process of the material grabbing operation comprises: scanning the cabin with the multi-line lidar while the grab bucket is in the process of the material grabbing operation, so as to obtain the second point cloud picture.
3. The method (100) of inspection of a grab bucket of a ship unloader according to claim 1,
the detecting a position (104) of the grab bucket during the grab operation based on the third point cloud picture comprises:
generating a point cloud model of the grab bucket based on the three-dimensional model of the grab bucket;
determining a positioning result of the grab bucket in the third point cloud picture according to a template matching algorithm based on the point cloud model;
and determining the position of the grab bucket in the material grabbing process based on the positioning result.
4. The method (100) of inspection of a grab bucket of a ship unloader according to claim 1,
the detecting a position (104) of the grab bucket during the grab operation based on the third point cloud picture comprises:
scanning the grab bucket by using a multi-line laser radar arranged on an arm frame pitching mechanism of the ship unloader at multiple angles to generate a point cloud model of the grab bucket;
determining a positioning result of the grab bucket in the third point cloud picture according to a template matching algorithm based on the point cloud model;
and determining the position of the grab bucket in the material grabbing process based on the positioning result.
5. The method (100) of inspection of a grab bucket of a ship unloader according to claim 1,
the detecting a position (104) of the grab bucket during the grab operation based on the third point cloud picture comprises:
inputting a marked grab bucket point cloud picture serving as training data into an artificial neural network so as to train the artificial neural network into a grab bucket positioning model which is suitable for positioning the grab bucket from the point cloud picture;
inputting the third point cloud picture into the grab bucket positioning model to determine the position of the grab bucket in the grab material operation process based on the positioning result of the grab bucket in the third point cloud picture.
6. The method (100) of inspection of a grab bucket of a ship unloader according to claim 1,
the detecting a position (104) of the grab bucket during the grab operation based on the third point cloud picture comprises:
projecting the third point cloud picture into a two-dimensional image or a three-dimensional image;
positioning the grab bucket from the two-dimensional image or the three-dimensional image in a target image recognition mode;
and determining the position of the grab bucket in the material grabbing process based on the positioning result of the grab bucket in the two-dimensional image or the three-dimensional image.
7. Detection apparatus (700) of a grab bucket of a ship unloader, characterized in that it comprises:
the first acquisition module (701) is used for acquiring a first point cloud picture of a cabin containing materials when the grab bucket is not in the material grabbing operation process;
a second acquisition module (702) for acquiring a second point cloud picture of the cabin when the grab bucket is in the process of material grabbing operation;
a determining module (703) for determining a difference between the second point cloud picture and the first point cloud picture as a third point cloud picture;
a detecting module (704) for detecting the position of the grab bucket in the grabbing operation process based on the third point cloud picture.
8. The detection device (700) of a grab bucket of a ship unloader according to claim 7,
the first acquisition module (701) is used for scanning the cabin of the grab bucket, which is not in the process of material grabbing operation, by using a multi-line laser radar arranged on an arm frame pitching mechanism of a ship unloader so as to acquire the first point cloud picture;
the second obtaining module (702) is configured to scan the cabin of the grab bucket in a grabbing operation process by using the multi-line lidar to obtain the second point cloud chart.
9. The detection device (700) of a grab bucket of a ship unloader according to claim 7,
the detection module (704) is configured to perform at least one of:
generating a point cloud model of the grab bucket based on the three-dimensional model of the grab bucket; determining a positioning result of the grab bucket in the third point cloud picture according to a template matching algorithm based on the point cloud model; determining the position of the grab bucket in the material grabbing process based on the positioning result;
scanning the grab bucket by using a multi-line laser radar arranged on an arm frame pitching mechanism of the ship unloader at multiple angles to generate a point cloud model of the grab bucket; determining a positioning result of the grab bucket in the third point cloud picture according to a template matching algorithm based on the point cloud model; determining the position of the grab bucket in the material grabbing process based on the positioning result;
inputting a marked grab bucket point cloud picture serving as training data into an artificial neural network so as to train the artificial neural network into a grab bucket positioning model which is suitable for positioning the grab bucket from the point cloud picture; inputting the third cloud point map into the grab bucket positioning model to determine the position of the grab bucket in the material grabbing process based on the positioning result of the grab bucket in the third cloud point map;
projecting the third point cloud picture into a two-dimensional image or a three-dimensional image; positioning the grab bucket from the two-dimensional image or the three-dimensional image in a target image recognition mode; and determining the position of the grab bucket in the material grabbing process based on the positioning result of the grab bucket in the two-dimensional image or the three-dimensional image.
10. Detection system of a grab bucket of a ship unloader, characterized in that it comprises:
the multi-line laser radar (30) is arranged on an arm support pitching mechanism (14) of the ship unloader (10) and used for scanning a cabin (20) containing materials (21) when the grab bucket (11) is not in the material grabbing operation process to obtain a first point cloud picture, and scanning the cabin (20) when the grab bucket (11) is in the material grabbing operation process to obtain a second point cloud picture;
a control module (40), arranged in a cart (18) of the ship unloader (10), for determining a difference between the second point cloud picture and the first point cloud picture as a third point cloud picture, and detecting the position of the grab bucket (11) during the material grabbing operation based on the third point cloud picture.
11. The detection system of the grab of the ship unloader as recited in claim 10,
the multiline lidar (30) is fixedly arranged outside the movement range of a trolley (13) of the ship unloader (10), and the view range faces the cabin (20).
12. Detection apparatus (800) of a grab bucket of a ship unloader, characterized in that it comprises:
at least one memory (801) configured to store computer readable code;
at least one processor (802) configured to invoke the computer readable code to perform the steps in the method (100) of detecting a grab of a ship unloader as claimed in any one of claims 1 to 6.
13. A computer readable medium, characterized in that it has stored thereon computer readable instructions which, when executed by a processor, cause the processor to carry out the steps in the method (100) of detection of a grab of a ship unloader as claimed in any one of claims 1 to 6.
14. A computer program product, characterized in that it is tangibly stored on a computer-readable medium and comprises computer-readable instructions that, when executed, cause at least one processor to perform the steps in the method (100) of detecting a grab of a ship unloader as claimed in any one of claims 1 to 6.
CN202210326186.4A 2022-03-30 2022-03-30 Method, apparatus, system, medium, and program product for detecting grab bucket of ship unloader Pending CN114648581A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210326186.4A CN114648581A (en) 2022-03-30 2022-03-30 Method, apparatus, system, medium, and program product for detecting grab bucket of ship unloader

Publications (1)

Publication Number Publication Date
CN114648581A true CN114648581A (en) 2022-06-21

Family

ID=81994829

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination