CN113435524A - Intelligent stacker and method, device and equipment for identifying position abnormality of tray - Google Patents

Intelligent stacker and method, device and equipment for identifying position abnormality of tray

Info

Publication number
CN113435524A
Authority
CN
China
Prior art keywords
tray
point cloud
cloud data
stacker
data
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202110742463.5A
Other languages
Chinese (zh)
Inventor
徐光运
孙文侠
张贻弓
沈长鹏
张小艺
刘鹏
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Lanjian Intelligent Technology Co ltd
Original Assignee
Lanjian Intelligent Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Lanjian Intelligent Technology Co ltd filed Critical Lanjian Intelligent Technology Co ltd
Priority to CN202110742463.5A priority Critical patent/CN113435524A/en
Publication of CN113435524A publication Critical patent/CN113435524A/en
Priority to PCT/CN2021/121160 priority patent/WO2023272985A1/en
Pending legal-status Critical Current

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/10 Segmentation; Edge detection
    • G06T 7/13 Edge detection
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B65 CONVEYING; PACKING; STORING; HANDLING THIN OR FILAMENTARY MATERIAL
    • B65G TRANSPORT OR STORAGE DEVICES, e.g. CONVEYORS FOR LOADING OR TIPPING, SHOP CONVEYOR SYSTEMS OR PNEUMATIC TUBE CONVEYORS
    • B65G 43/00 Control devices, e.g. for safety, warning or fault-correcting
    • B65G 43/08 Control devices operated by article or material being fed, conveyed or discharged
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B65 CONVEYING; PACKING; STORING; HANDLING THIN OR FILAMENTARY MATERIAL
    • B65G TRANSPORT OR STORAGE DEVICES, e.g. CONVEYORS FOR LOADING OR TIPPING, SHOP CONVEYOR SYSTEMS OR PNEUMATIC TUBE CONVEYORS
    • B65G 57/00 Stacking of articles
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 3/00 Geometric image transformations in the plane of the image
    • G06T 3/06 Topological mapping of higher dimensional structures onto lower dimensional surfaces
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B65 CONVEYING; PACKING; STORING; HANDLING THIN OR FILAMENTARY MATERIAL
    • B65G TRANSPORT OR STORAGE DEVICES, e.g. CONVEYORS FOR LOADING OR TIPPING, SHOP CONVEYOR SYSTEMS OR PNEUMATIC TUBE CONVEYORS
    • B65G 2201/00 Indexing codes relating to handling devices, e.g. conveyors, characterised by the type of product or load being conveyed or handled
    • B65G 2201/02 Articles
    • B65G 2201/0235 Containers
    • B65G 2201/0258 Trays, totes or bins
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B65 CONVEYING; PACKING; STORING; HANDLING THIN OR FILAMENTARY MATERIAL
    • B65G TRANSPORT OR STORAGE DEVICES, e.g. CONVEYORS FOR LOADING OR TIPPING, SHOP CONVEYOR SYSTEMS OR PNEUMATIC TUBE CONVEYORS
    • B65G 2203/00 Indexing code relating to control or detection of the articles or the load carriers during conveying
    • B65G 2203/04 Detection means
    • B65G 2203/041 Camera
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/10 Image acquisition modality
    • G06T 2207/10028 Range image; Depth image; 3D point clouds

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Mechanical Engineering (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The invention discloses an intelligent stacker and a method, device and equipment for identifying tray position abnormalities. The method comprises the following steps: collecting point cloud data of the tray; processing the point cloud data to obtain the tray position; and judging whether the tray is abnormal, where tray abnormalities include tray offset, fork-hole collapse and beam settlement. The invention uses 3D vision technology to detect the position and abnormal state of the cargo tray to be picked, so that the stacker avoids colliding with the goods when the tray is offset, collapsed or otherwise abnormal, and safety during the picking process is increased.

Description

Intelligent stacker and method, device and equipment for identifying position abnormality of tray
Technical Field
The invention relates to an intelligent stacker and to a method, device and equipment for identifying tray position abnormalities, and belongs to the technical field of warehouse logistics.
Background
Currently, mainstream stacker positioning technology locates the stacker with an encoder, laser ranging and the like, moves it to the picking position at a specified coordinate, and then picks the goods.
However, during picking the pallet itself is not detected: the stacker simply forks the goods directly after reaching the target position. Factors such as the weight of the goods on the pallet and placement errors of the stacker can easily cause the pallet on the rack to shift, deform or collapse, so the stacker risks colliding with the goods when picking, causing unpredictable losses for users.
To avoid this risk of cargo collision, the invention develops a technology for detecting the pallet position before picking.
An efficient, low-cost method for detecting the pallet position is therefore required.
Disclosure of Invention
To solve the above problems, the invention provides an intelligent stacker and a method, device and equipment for identifying tray position abnormalities, which can prevent cargo collisions.
The technical solution adopted to solve the above technical problems is as follows:
in a first aspect, an embodiment of the present invention provides a method for identifying a tray position abnormality, including the following steps:
collecting point cloud data of the tray;
processing point cloud data;
utilizing a 3D point cloud edge detection algorithm to judge the tray state, wherein the tray state includes: tray presence or absence, tray offset, fork-hole collapse, tray slider rotation, beam settlement, and beam tilt.
As a possible implementation manner of this embodiment, the processing point cloud data includes:
correcting the inclination of the collected point cloud data according to the calibration data;
converting the tray front end point cloud into a 2D image;
and carrying out regional edge detection on the 2D image by using a 3D point cloud edge detection algorithm, and positioning the boundary of the tray.
As a possible implementation manner of this embodiment, the determining the tray state by using a 3D point cloud edge detection algorithm includes:
and positioning the tray boundary in the 2D image by utilizing a region edge searching technology, comparing the result value with the data of the original tray template to obtain the position deviation of the result value and the data of the original tray template, judging that the tray is abnormal if the position deviation is greater than a set threshold value, and otherwise, judging that the tray is abnormal.
As one possible implementation of the present embodiment,
the tray offset comprises: left tray offset and right tray offset;
the hole collapse comprises: left fork-hole collapse and right fork-hole collapse;
the beam settlement comprises: left crossbeam settlement and right crossbeam settlement;
the beam tilt comprises: left crossbeam tilt and right crossbeam tilt.
In a second aspect, an apparatus for detecting a tray position according to an embodiment of the present invention includes:
the point cloud data acquisition module is used for acquiring point cloud data of the tray;
the point cloud data processing module is used for processing the point cloud data;
the tray abnormality judging module is used for judging the tray state by using a 3D point cloud edge detection algorithm, where the tray state includes: tray presence or absence, tray offset, fork-hole collapse, tray slider rotation, beam settlement, and beam tilt.
As a possible implementation manner of this embodiment, the point cloud data processing module includes:
the point cloud data correction module is used for correcting the inclination of the collected point cloud data according to the calibration data;
the image conversion module is used for converting the tray front-end point cloud into a 2D image;
and the tray boundary positioning module is used for carrying out regional edge detection on the 2D image and positioning the boundary of the tray.
As a possible implementation manner of this embodiment, the tray abnormality determining module is specifically configured to:
and positioning the tray boundary in the 2D image by utilizing a region edge searching technology, comparing the result value with the data of the original tray template to obtain the position deviation of the result value and the data of the original tray template, judging that the tray is abnormal if the position deviation is greater than a set threshold value, and otherwise, judging that the tray is abnormal.
As one possible implementation of the present embodiment,
the tray offset comprises: left tray offset and right tray offset;
the hole collapse comprises: left fork-hole collapse and right fork-hole collapse;
the beam settlement comprises: left crossbeam settlement and right crossbeam settlement;
the beam tilt comprises: left crossbeam tilt and right crossbeam tilt.
In a third aspect, an embodiment of the invention provides an intelligent stacker, which comprises a stacker body, a 3D camera and an industrial personal computer, wherein the 3D camera photographs the tray and collects point cloud data of the tray; the industrial personal computer is provided with a computer program which, when run, executes the steps of any of the above methods for identifying a tray position abnormality.
As a possible implementation manner of this embodiment, the 3D camera is installed on the base and/or a side bracket of the stacker, which meets the requirements for the 3D camera to collect the tray data.
As a possible implementation manner of this embodiment, before performing any of the above methods for identifying a tray position abnormality, a calibration plate is placed at the standard tray position, the calibration plate is identified using the confidence image information, a reference pose of the camera is obtained, and the pose relationship between the image acquisition device and the standard tray pose is obtained as calibration data.
In a fourth aspect, a computer device according to an embodiment of the present invention includes a processor, a memory and a bus, where the memory stores machine-readable instructions executable by the processor; when the computer device runs, the processor and the memory communicate through the bus, and the processor executes the machine-readable instructions to perform the steps of any of the above methods for identifying a tray position abnormality.
In a fifth aspect, an embodiment of the present invention provides a storage medium on which a computer program is stored; when the computer program is executed by a processor, it performs the steps of any of the above methods for identifying a tray position abnormality.
The technical scheme of the embodiment of the invention has the following beneficial effects:
according to the technology of the visual detection tray, the visual detection tray is utilized before the stacker picks up the goods, whether the position, the deformation and the like of the visual detection tray are normal is detected, and the goods are picked up after judgment, so that the goods collision is avoided, and the correction or alarm operation can be performed on the abnormal tray.
The invention utilizes the 3D vision technology to detect the position and the abnormality of the goods tray to be taken, avoids the occurrence of goods collision of the stacker when the tray has abnormality such as deviation, collapse and the like, and increases the safety in the goods taking process.
In the goods taking process of the stacker, the 3D vision technology is used for detecting the position and the abnormality of the goods tray to be taken, so that the stacker is prevented from colliding when the tray is deviated, collapsed and the like.
Description of the drawings:
FIG. 1 is a flow chart illustrating a method of identifying a tray position anomaly in accordance with an exemplary embodiment;
FIG. 2 is a schematic diagram illustrating a tray position anomaly in accordance with an exemplary embodiment;
FIG. 3 is a flow diagram illustrating a 3D point cloud edge detection algorithm in accordance with an exemplary embodiment;
FIG. 4 is a block diagram illustrating an apparatus for detecting a position of a tray in accordance with one exemplary embodiment;
FIG. 5 is a schematic diagram illustrating a 3D camera mounting location according to an exemplary embodiment;
FIG. 6 is a schematic diagram illustrating a 3D camera mounted on a stacker according to an exemplary embodiment;
FIG. 7 is a schematic diagram illustrating a dual camera calibration pose transformation in accordance with an exemplary embodiment;
FIG. 8 is a block diagram illustrating a computer device in accordance with an exemplary embodiment.
Detailed Description
The invention is further illustrated by the following examples in conjunction with the accompanying drawings:
in order to clearly explain the technical features of the present invention, the following detailed description of the present invention is provided with reference to the accompanying drawings. The following disclosure provides many different embodiments, or examples, for implementing different features of the invention. To simplify the disclosure of the present invention, the components and arrangements of specific examples are described below. Furthermore, the present invention may repeat reference numerals and/or letters in the various examples. This repetition is for the purpose of simplicity and clarity and does not in itself dictate a relationship between the various embodiments and/or configurations discussed. It should be noted that the components illustrated in the figures are not necessarily drawn to scale. Descriptions of well-known components and processing techniques and procedures are omitted so as to not unnecessarily limit the invention.
As shown in fig. 1, an embodiment of the present invention provides a method for identifying a tray position abnormality, including the following steps:
collecting point cloud data of the tray;
processing point cloud data;
utilizing a 3D point cloud edge detection algorithm to judge the tray state, wherein the tray state includes: tray presence or absence, tray offset, fork-hole collapse, tray slider rotation, beam settlement, and beam tilt.
As a possible implementation manner of this embodiment, the collecting of the point cloud data of the tray includes:
after the calibration plate is placed at the standard tray position, identifying the calibration plate using the confidence image information, obtaining the reference pose of the camera, and obtaining the pose relationship between the image acquisition device and the standard tray pose.
As a possible implementation manner of this embodiment, the processing point cloud data includes:
correcting the inclination of the collected point cloud data according to the calibration data;
converting the tray front end point cloud into a 2D image;
and carrying out regional edge detection on the 2D image by using a 3D point cloud edge detection algorithm, and positioning the boundary of the tray.
As a possible implementation manner of this embodiment, the determining the tray state by using a 3D point cloud edge detection algorithm includes:
and positioning the tray boundary in the 2D image by utilizing a region edge searching technology, comparing the result value with the data of the original tray template to obtain the position deviation of the result value and the data of the original tray template, judging that the tray is abnormal if the position deviation is greater than a set threshold value, and otherwise, judging that the tray is abnormal.
As a possible implementation of this embodiment, as shown in figure 2,
the tray offset comprises: left tray offset and right tray offset;
the hole collapse comprises: left fork-hole collapse and right fork-hole collapse;
the beam settlement comprises: left crossbeam settlement and right crossbeam settlement;
the beam tilt comprises: left crossbeam tilt and right crossbeam tilt.
As shown in fig. 3, the present invention employs a 3D point cloud edge detection algorithm. In this scheme, an industrial 3D camera photographs the tray to obtain its point cloud information; the tilt of the point cloud is corrected according to the calibration data, the tray front-end point cloud is converted into a 2D image, the tray boundary is located in the 2D image with a region edge search technique, and the result is compared with the original template position data to obtain quantities such as the tray's relative left-right translation and up-down settlement.
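The pipeline just described can be sketched as follows, assuming the calibration data are available as a 4x4 homogeneous transform and the tray front face lies in a known depth window; the depth window, pixel resolution and function names are illustrative assumptions rather than the patent's implementation.

```python
import numpy as np

def correct_tilt(points: np.ndarray, calib_transform: np.ndarray) -> np.ndarray:
    """Undo the camera tilt by applying the 4x4 calibration transform to an Nx3 point cloud."""
    homogeneous = np.hstack([points, np.ones((points.shape[0], 1))])
    return (calib_transform @ homogeneous.T).T[:, :3]

def project_front_face(points: np.ndarray, depth_window=(0.0, 0.05), resolution=0.002) -> np.ndarray:
    """Keep points near the tray front plane and rasterize them into a 2D depth image."""
    mask = (points[:, 2] >= depth_window[0]) & (points[:, 2] <= depth_window[1])
    front = points[mask]
    cols = ((front[:, 0] - front[:, 0].min()) / resolution).astype(int)
    rows = ((front[:, 1] - front[:, 1].min()) / resolution).astype(int)
    image = np.zeros((rows.max() + 1, cols.max() + 1), dtype=np.float32)
    image[rows, cols] = front[:, 2]  # one depth value per occupied pixel
    return image
```

The resulting 2D image is then searched with the region edge technique described below, and the located boundary is compared with the stored template.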
The idea of the 2D image region edge search technique is to arrange detection lines at equal intervals in the same direction within a given ROI and detect the edge gradient extremum on each line, i.e. the edge position.
The region edge detection technique includes the following parts (a code sketch follows this list):
a line iterator, which extracts gray-level information along a specified line of the image;
an edge point detection algorithm, which finds the positions of the largest gray-level rise and fall along a line and treats them as edge positions;
a detection line layout algorithm, which sets the detection range and the line spacing or total number of lines, and arranges the detection lines at equal intervals in the same direction inside the ROI;
straight-line fitting, in which the edge gradient extrema detected on the equally spaced parallel detection lines, i.e. the detected edge positions, are fitted to a straight line that is taken as the actual straight edge.
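A compact sketch of this region edge search is given below, assuming horizontal detection lines, a rectangular ROI given as (top, bottom, left, right) and a gray-level gradient extremum as the edge criterion; the function names and the default of 20 detection lines are illustrative assumptions.

```python
import numpy as np

def edge_point_on_line(gray_values: np.ndarray) -> int:
    """Index of the strongest gray-level step along one detection line (the edge gradient extremum)."""
    gradient = np.diff(gray_values.astype(np.float32))
    return int(np.argmax(np.abs(gradient)))

def find_edge_line(image: np.ndarray, roi, n_lines: int = 20):
    """Lay n_lines horizontal detection lines in the ROI and fit a straight line to the edge points."""
    top, bottom, left, right = roi
    rows = np.linspace(top, bottom, n_lines).astype(int)
    edge_points = []
    for r in rows:
        col = left + edge_point_on_line(image[r, left:right])
        edge_points.append((col, r))
    edge_points = np.array(edge_points, dtype=np.float32)
    # Fit x = a*y + b so that near-vertical tray edges stay well conditioned.
    a, b = np.polyfit(edge_points[:, 1], edge_points[:, 0], 1)
    return a, b
```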
In the invention, the physical edges of the tray are stably detected by the 3D point cloud edge detection algorithm, the edge deformation is evaluated, and the offset and deformation of the tray are determined, so that tray states such as tray presence, tray offset, fork-hole collapse, tray slider rotation, beam settlement and beam tilt can be obtained.
As shown in fig. 4, an apparatus for detecting a tray position according to an embodiment of the present invention includes:
the point cloud data acquisition module is used for acquiring point cloud data of the tray;
the point cloud data processing module is used for processing the point cloud data;
the tray abnormality judging module is used for judging the tray state by using a 3D point cloud edge detection algorithm, where the tray state includes: tray presence or absence, tray offset, fork-hole collapse, tray slider rotation, beam settlement, and beam tilt.
As a possible implementation manner of this embodiment, the point cloud data processing module includes:
the point cloud data correction module is used for correcting the inclination of the collected point cloud data according to the calibration data;
the image conversion module is used for converting the tray front-end point cloud into a 2D image;
and the tray boundary positioning module is used for carrying out regional edge detection on the 2D image and positioning the boundary of the tray.
As a possible implementation manner of this embodiment, the tray abnormality determining module is specifically configured to:
and positioning the tray boundary in the 2D image by utilizing a region edge searching technology, comparing the result value with the data of the original tray template to obtain the position deviation of the result value and the data of the original tray template, judging that the tray is abnormal if the position deviation is greater than a set threshold value, and otherwise, judging that the tray is abnormal.
As one possible implementation of the present embodiment,
the tray offset comprises: left tray offset and right tray offset;
the hole collapse comprises: left fork-hole collapse and right fork-hole collapse;
the beam settlement comprises: left crossbeam settlement and right crossbeam settlement;
the beam tilt comprises: left crossbeam tilt and right crossbeam tilt.
In a third aspect, an embodiment of the invention provides an intelligent stacker, which comprises a stacker body, a 3D camera and an industrial personal computer, wherein the 3D camera photographs the tray and collects point cloud data of the tray; the industrial personal computer is provided with a computer program which, when run, executes the steps of any of the above methods for identifying a tray position abnormality.
As a possible implementation manner of this embodiment, the 3D camera is installed on the base and/or a side bracket of the stacker. As shown in figs. 5 and 6, mounting the 3D camera on the base or a side bracket of the stacker meets the requirements for collecting the tray data.
As a possible implementation manner of this embodiment, before performing any of the above methods for identifying a tray position abnormality, a calibration plate is placed at the standard tray position, the calibration plate is identified using the confidence image information, a reference pose of the camera is obtained, and the pose relationship between the image acquisition device and the standard tray pose is obtained as calibration data.
3D cameras are used for image capture, and the number and mounting positions of the cameras are matched to the stacker type. For shooting one side of a single-deep stacker, two cameras are used to photograph the tray; for shooting one side of a double-deep stacker, as shown in figs. 5 and 6, two cameras (near-position cameras) photograph the near-depth tray and one camera (far-position camera) photographs the far-depth tray. The cameras are mounted on the cargo-carrying platform, and the industrial personal computer and the network switch are installed in the electrical cabinet of the cargo-carrying platform; all of them move together with the platform.
Because space on the stacker is compact, the camera position and field of view are severely limited: the camera cannot photograph the tray head-on, can only capture part of the tray at an angle, and the imaging distance is short. For photographing the pallet at the near-depth picking position, the scheme uses one camera on each side of the stacker pointed toward the middle, each covering one pallet block and one fork hole; for photographing the far-depth pallet, the scheme uses a camera mounted in the gap between the two forks, shooting horizontally forward. The mounting positions and camera fields of view are shown in figs. 5 and 6.
The invention adopts a calibration method for multiple ToF 3D cameras: after a calibration plate is placed at the standard tray position, the calibration plate is identified using the confidence image information of the ToF 3D cameras, a reference pose of each camera is obtained, and from this the pose relationship between the two 3D cameras and the pose of the stacker's standard tray is obtained. This calibration technique acquires the pose transformation between the multiple ToF 3D cameras and the stacker (the tray at the standard picking position). It is used to calibrate the camera positions against the stacker after the cameras are installed and to provide calibration data for the subsequent point cloud processing.
Because the two cameras have no common field of view, and to preserve calibration accuracy, the calibration plate is moved manually (or automatically by a fixture) so that it can be photographed separately by each of the two cameras. When the plate is moved manually, it is shifted a fixed distance along one axis of its own coordinate system, so that the transform describing the change of the calibration plate's pose, $^{cal}H_{cal'}$, contains only a translation.
As shown in fig. 7, after camera calibration, $^{camA}H_{cal}$ and $^{camB}H_{cal'}$ are both obtained from the extrinsic calibration, so the pose of camera B in the frame of camera A is
$$^{camA}H_{camB} = {}^{camA}H_{cal} \cdot {}^{cal}H_{cal'} \cdot \left({}^{camB}H_{cal'}\right)^{-1},$$
and from this the pose transformation between the two ToF 3D cameras and the pallet is obtained.
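The pose chain above composes directly once each transform is available as a 4x4 homogeneous matrix. The sketch below assumes H_A_cal and H_B_calp are the extrinsic calibration results of cameras A and B against the plate before and after the shift, and that the plate is moved 200 mm along its own x axis; these names and the distance are illustrative assumptions.

```python
import numpy as np

def pose_B_in_A(H_A_cal: np.ndarray, H_cal_calp: np.ndarray, H_B_calp: np.ndarray) -> np.ndarray:
    """Compose camA->camB: H_A_B = H_A_cal @ H_cal_cal' @ inv(H_B_cal')."""
    return H_A_cal @ H_cal_calp @ np.linalg.inv(H_B_calp)

# The plate is shifted a fixed distance along one of its own axes, so H_cal_cal'
# is a pure translation (illustrative 200 mm shift along x).
H_cal_calp = np.eye(4)
H_cal_calp[0, 3] = 0.200
```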
The process of detecting the position of the pallet by using the stacker is as follows:
1. The stacker moves to the target picking position, and the stacker master-control PLC triggers the industrial personal computer over TCP communication.
2. The industrial personal computer calls the camera to take a picture and runs the 3D point cloud edge detection algorithm to obtain the core detection quantities of the current tray, such as offset, fork-hole collapse and beam settlement.
3. The industrial personal computer compares the core detection quantities of the current tray with the pre-stored quantities of the standard tray, compares the deviations with the set thresholds, and determines whether to pick, correct or alarm (a sketch of this decision follows the list).
4. The industrial personal computer returns the processing result to the stacker master-control PLC, and the stacker performs the corresponding operation according to the vision result.
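The decision in step 3 reduces to comparing each core detection quantity with its stored standard-tray value and thresholding the worst deviation. The sketch below is illustrative only: the field names and the 10 mm / 25 mm limits are assumptions, not values from the patent.

```python
STANDARD = {"offset_mm": 0.0, "hole_collapse_mm": 0.0, "beam_drop_mm": 0.0}
THRESHOLDS = {"correct": 10.0, "alarm": 25.0}  # assumed millimetre limits

def decide_action(current: dict) -> str:
    """Return 'pick', 'correct' or 'alarm' based on the largest deviation from the standard tray."""
    worst = max(abs(current[key] - STANDARD[key]) for key in STANDARD)
    if worst <= THRESHOLDS["correct"]:
        return "pick"      # within tolerance: fork the goods directly
    if worst <= THRESHOLDS["alarm"]:
        return "correct"   # recoverable deviation: adjust the picking position
    return "alarm"         # report back to the stacker master-control PLC

# Example: a 12 mm offset with little collapse leads to a correction rather than an alarm.
print(decide_action({"offset_mm": -12.0, "hole_collapse_mm": 1.5, "beam_drop_mm": 0.8}))
```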
The specific detection items and detection methods used when detecting the tray position are as follows (an illustrative classification sketch follows the list):
(1) Tray presence: due to manual operation and other reasons, some storage positions on the stacker's rack have no tray, and the stacker may fork an empty position. The invention uses the result of the tray edge detection algorithm to judge whether a tray is present at the storage position.
(2) Tray offset: due to mechanical errors, pallet slipping and the like when the stacker places the goods, the placed pallet may not be at the standard picking position and may be offset to the left or right. The invention visually detects the positions of the slider blocks on both sides of the tray and judges the position of the whole tray.
(3) Tray collapse: impacts on the pallet during handling and long-term storage of heavy cargo on the pallet can cause it to collapse. The invention visually detects the edge positions of the tray's fork holes, calculates the collapse height and judges whether the tray has collapsed.
(4) Beam settlement and beam tilt: the crossbeams of different storage positions are mounted at different heights, so a height difference may exist between the cargo-carrying platform and the crossbeam; the crossbeam may also tilt to the left or right, making its left and right heights inconsistent and causing problems when placing or picking goods. The invention visually measures the beam height under the left and right fork holes and judges whether the two heights are consistent.
(5) Tray slider rotation: while the tray is being handled, collisions can make the slider blocks rotate or shift, and a rotated or shifted slider may be struck when the stacker picks goods normally. Building on the tray localization, this patent analyzes the point cloud of the slider region, determines the slider angle from the point cloud analysis and, combined with slider edge detection, judges whether the slider has rotated.
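Summarizing items (1) to (5), the measured edge quantities can be mapped to the tray states listed earlier. In the sketch below the state names follow the text, while the measurement keys and the 8 mm / 3 degree tolerances are illustrative assumptions.

```python
def classify_tray(meas: dict, tol_mm: float = 8.0, tol_deg: float = 3.0) -> list:
    """Map measured quantities to the tray states of items (1)-(5); the limits are illustrative."""
    if not meas.get("tray_present", True):
        return ["no tray"]                                    # item (1): empty storage position
    states = []
    if abs(meas["slider_offset_mm"]) > tol_mm:                # item (2): left/right tray offset
        states.append("tray offset left" if meas["slider_offset_mm"] < 0 else "tray offset right")
    if meas["left_hole_drop_mm"] > tol_mm or meas["right_hole_drop_mm"] > tol_mm:
        states.append("fork-hole collapse")                   # item (3): collapse height over limit
    if abs(meas["left_beam_height_mm"] - meas["right_beam_height_mm"]) > tol_mm:
        states.append("beam tilt")                            # item (4): uneven beam heights
    if abs(meas["slider_angle_deg"]) > tol_deg:
        states.append("slider rotation")                      # item (5): rotated slider block
    return states or ["normal"]
```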
FIG. 8 is a block diagram illustrating a computer device in accordance with an exemplary embodiment. As shown in fig. 8, an embodiment of the present invention provides a computer device, which includes a processor, a memory and a bus, where the memory stores machine-readable instructions executable by the processor; when the computer device is running, the processor communicates with the memory through the bus, and the processor executes the machine-readable instructions to perform the steps of any of the above methods for identifying a tray position abnormality.
Specifically, the memory and the processor can be a general-purpose memory and processor, which are not limited to specific embodiments, and the above method for identifying a tray position abnormality can be performed when the processor runs a computer program stored in the memory.
Those skilled in the art will appreciate that the configuration of the computer device shown in fig. 8 does not constitute a limitation of the computer device and may include more or fewer components than shown, or some components may be combined, or some components may be split, or a different arrangement of components.
In some embodiments, the computer device may further include a touch screen operable to display a graphical user interface (e.g., a launch interface for an application) and receive user operations with respect to the graphical user interface (e.g., launch operations with respect to the application). A particular touch screen may include a display panel and a touch panel. The Display panel may be configured in the form of an LCD (Liquid Crystal Display), an OLED (Organic Light-Emitting Diode), and the like. The touch panel may collect contact or non-contact operations on or near the touch panel by a user and generate preset operation instructions, for example, operations of the user on or near the touch panel using any suitable object or accessory such as a finger, a stylus, etc. In addition, the touch panel may include two parts of a touch detection device and a touch controller. The touch detection device detects the touch direction and gesture of a user, detects signals brought by touch operation and transmits the signals to the touch controller; the touch controller receives touch information from the touch detection device, converts the touch information into information capable of being processed by the processor, sends the information to the processor, and receives and executes commands sent by the processor. In addition, the touch panel may be implemented by various types such as a resistive type, a capacitive type, an infrared ray, a surface acoustic wave, and the like, and may also be implemented by any technology developed in the future. Further, the touch panel may overlay the display panel, a user may operate on or near the touch panel overlaid on the display panel according to a graphical user interface displayed by the display panel, the touch panel detects an operation thereon or nearby and transmits the operation to the processor to determine a user input, and the processor then provides a corresponding visual output on the display panel in response to the user input. In addition, the touch panel and the display panel can be realized as two independent components or can be integrated.
Corresponding to the above method, an embodiment of the present invention further provides a storage medium, where a computer program is stored on the storage medium, and when the computer program is executed by a processor, the steps of the above method for identifying a tray position abnormality are performed.
The apparatus provided by the embodiment of the present application can be specific hardware on the device, or software or firmware installed on the device. The apparatus has the same implementation principle and technical effect as the foregoing method embodiments; for brevity, where the apparatus embodiments are silent, reference may be made to the corresponding contents in the foregoing method embodiments. It is clear to those skilled in the art that, for convenience and brevity of description, the specific working processes of the foregoing systems, apparatuses and units may refer to the corresponding processes in the foregoing method embodiments, and are not described here again.
As will be appreciated by one skilled in the art, embodiments of the present application may be provided as a method, system, or computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus and method may be implemented in other ways. The above-described embodiments of the apparatus are merely illustrative, and for example, a division of modules is merely a division of logical functions, and an actual implementation may have another division, and for example, a plurality of modules or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection of devices or modules through some communication interfaces, and may be in an electrical, mechanical or other form.
Modules described as separate parts may or may not be physically separate, and parts displayed as modules may or may not be physical modules, may be located in one place, or may be distributed on a plurality of network modules. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of the present embodiment.
In addition, functional modules in the embodiments provided in the present application may be integrated into one processing module, or each module may exist alone physically, or two or more modules are integrated into one module.
The present application is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the application. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
Finally, it should be noted that: the above embodiments are only for illustrating the technical solutions of the present invention and not for limiting the same, and although the present invention is described in detail with reference to the above embodiments, those of ordinary skill in the art should understand that: modifications and equivalents may be made to the embodiments of the invention without departing from the spirit and scope of the invention, which is to be covered by the claims.

Claims (13)

1. A method for identifying the position abnormality of a tray is characterized by comprising the following steps:
collecting point cloud data of the tray;
processing point cloud data;
utilizing a 3D point cloud edge detection algorithm to judge the tray state, wherein the tray state comprises: tray presence or absence, tray offset, fork-hole collapse, tray slider rotation, beam settlement, and beam tilt.
2. The method for identifying the tray position abnormality according to claim 1, wherein the processing of the point cloud data includes:
correcting the inclination of the collected point cloud data according to the calibration data;
converting the tray front end point cloud into a 2D image;
and carrying out regional edge detection on the 2D image by using a 3D point cloud edge detection algorithm, and positioning the boundary of the tray.
3. The method for identifying the tray position abnormality according to claim 2, wherein the determining the tray state by using the 3D point cloud edge detection algorithm includes:
and positioning the tray boundary in the 2D image by utilizing a region edge searching technology, comparing the result value with the data of the original tray template to obtain the position deviation of the result value and the data of the original tray template, judging that the tray is abnormal if the position deviation is greater than a set threshold value, and otherwise, judging that the tray is abnormal.
4. The method for identifying abnormality in tray position according to any one of claims 1 to 3,
the tray offset comprises: left tray offset and right tray offset;
the hole collapse comprises: left fork-hole collapse and right fork-hole collapse;
the beam settlement comprises: left crossbeam settlement and right crossbeam settlement;
the beam tilt comprises: left crossbeam tilt and right crossbeam tilt.
5. An apparatus for detecting a position of a tray, comprising:
the point cloud data acquisition module is used for acquiring point cloud data of the tray;
the point cloud data processing module is used for processing the point cloud data;
the tray abnormality judging module is used for judging the tray state by using a 3D point cloud edge detection algorithm, where the tray state comprises: tray presence or absence, tray offset, fork-hole collapse, tray slider rotation, beam settlement, and beam tilt.
6. The apparatus for detecting a position of a pallet as claimed in claim 5, wherein said point cloud data processing module comprises:
the point cloud data correction module is used for correcting the inclination of the collected point cloud data according to the calibration data;
the image conversion module is used for converting the tray front-end point cloud into a 2D image;
and the tray boundary positioning module is used for carrying out regional edge detection on the 2D image and positioning the boundary of the tray.
7. The apparatus for detecting a tray position as claimed in claim 6, wherein the tray abnormality determination module is specifically configured to:
and positioning the tray boundary in the 2D image by utilizing a region edge searching technology, comparing the result value with the data of the original tray template to obtain the position deviation of the result value and the data of the original tray template, judging that the tray is abnormal if the position deviation is greater than a set threshold value, and otherwise, judging that the tray is abnormal.
8. The apparatus for detecting the position of a tray according to any one of claims 5 to 7,
the tray offset comprises: left tray offset and right tray offset;
the hole collapse comprises: left fork-hole collapse and right fork-hole collapse;
the beam settlement comprises: left crossbeam settlement and right crossbeam settlement;
the beam tilt comprises: left crossbeam tilt and right crossbeam tilt.
9. An intelligent stacker comprises a stacker body, and is characterized by further comprising a 3D camera and an industrial personal computer, wherein the 3D camera photographs a tray and collects point cloud data of the tray; the industrial personal computer is equipped with a computer program which, when running, executes the steps of the method for identifying a pallet position anomaly according to any one of claims 1 to 4.
10. The intelligent stacker of claim 9 wherein the 3D camera is mounted on a base and/or a side support of the stacker.
11. The intelligent stacker according to claim 10, wherein, before performing the method of identifying a pallet position abnormality according to any one of claims 1 to 4, a calibration plate is placed at the standard position of the pallet, the calibration plate is identified using the confidence image information, a reference pose of the camera is obtained, and the pose relationship between the image acquisition device and the pose of the standard pallet is obtained as calibration data.
12. A computer device comprising a processor, a memory and a bus, wherein the memory stores machine-readable instructions executable by the processor, the processor and the memory communicate via the bus when the computer device is running, and the processor executes the machine-readable instructions to perform the steps of the method for identifying a tray position abnormality according to any one of claims 1 to 4.
13. A storage medium, characterized in that the storage medium has stored thereon a computer program which, when being executed by a processor, carries out the steps of the method for identifying a tray position abnormality according to any one of claims 1 to 4.
CN202110742463.5A 2021-06-30 2021-06-30 Intelligent stacker and method, device and equipment for identifying position abnormality of tray Pending CN113435524A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202110742463.5A CN113435524A (en) 2021-06-30 2021-06-30 Intelligent stacker and method, device and equipment for identifying position abnormality of tray
PCT/CN2021/121160 WO2023272985A1 (en) 2021-06-30 2021-09-28 Smart stacker crane, and method, apparatus, and device for recognizing anomalous pallet position

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110742463.5A CN113435524A (en) 2021-06-30 2021-06-30 Intelligent stacker and method, device and equipment for identifying position abnormality of tray

Publications (1)

Publication Number Publication Date
CN113435524A true CN113435524A (en) 2021-09-24

Family

ID=77758405

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110742463.5A Pending CN113435524A (en) 2021-06-30 2021-06-30 Intelligent stacker and method, device and equipment for identifying position abnormality of tray

Country Status (2)

Country Link
CN (1) CN113435524A (en)
WO (1) WO2023272985A1 (en)

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114955342A (en) * 2022-06-02 2022-08-30 浙江中烟工业有限责任公司 Method and system for searching stagnant trays in elevated library
CN114972505A (en) * 2022-04-29 2022-08-30 弥费实业(上海)有限公司 Position recognition system
CN115546202A (en) * 2022-11-23 2022-12-30 青岛中德智能技术研究院 Tray detection and positioning method for unmanned forklift
WO2023272985A1 (en) * 2021-06-30 2023-01-05 兰剑智能科技股份有限公司 Smart stacker crane, and method, apparatus, and device for recognizing anomalous pallet position
CN116040261A (en) * 2022-12-23 2023-05-02 青岛宝佳智能装备股份有限公司 Special tray turnover machine
CN117381804A (en) * 2023-12-13 2024-01-12 珠海格力智能装备有限公司 Automatic material placement part aligning method and device for intelligent transfer robot

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116309550B (en) * 2023-05-11 2023-08-04 聊城市飓风工业设计有限公司 Integrated circuit patch abnormality identification method based on image processing
CN117649564B (en) * 2024-01-29 2024-05-14 成都飞机工业(集团)有限责任公司 Aircraft cabin assembly deviation recognition device and quantitative evaluation method

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107102009A (en) * 2017-05-04 2017-08-29 武汉理工大学 A kind of method of the cylinder spool quality testing based on machine vision
CN107507167A (en) * 2017-07-25 2017-12-22 上海交通大学 A kind of cargo pallet detection method and system matched based on a cloud face profile
CN108876771A (en) * 2018-06-04 2018-11-23 广东工业大学 A kind of detection method of undercut welding defect
CN110097574A (en) * 2019-04-24 2019-08-06 南京邮电大学 A kind of real-time pose estimation method of known rigid body
CN112001972A (en) * 2020-09-25 2020-11-27 劢微机器人科技(深圳)有限公司 Tray pose positioning method, device and equipment and storage medium

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110058591A (en) * 2019-04-24 2019-07-26 合肥柯金自动化科技股份有限公司 A kind of AGV system based on laser radar Yu depth camera hybrid navigation
CN110781794A (en) * 2019-10-21 2020-02-11 兰剑智能科技股份有限公司 Intelligent identification method and system
CN110950277A (en) * 2019-12-16 2020-04-03 浙江迈睿机器人有限公司 Tray posture recognition system and method for AGV forklift
CN112070759B (en) * 2020-09-16 2023-10-24 浙江光珀智能科技有限公司 Fork truck tray detection and positioning method and system
CN112614183A (en) * 2020-12-25 2021-04-06 深圳市镭神智能***有限公司 Tray pose detection method, device, equipment and storage medium
CN113435524A (en) * 2021-06-30 2021-09-24 兰剑智能科技股份有限公司 Intelligent stacker and method, device and equipment for identifying position abnormality of tray

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107102009A (en) * 2017-05-04 2017-08-29 武汉理工大学 A kind of method of the cylinder spool quality testing based on machine vision
CN107507167A (en) * 2017-07-25 2017-12-22 上海交通大学 A kind of cargo pallet detection method and system matched based on a cloud face profile
CN108876771A (en) * 2018-06-04 2018-11-23 广东工业大学 A kind of detection method of undercut welding defect
CN110097574A (en) * 2019-04-24 2019-08-06 南京邮电大学 A kind of real-time pose estimation method of known rigid body
CN112001972A (en) * 2020-09-25 2020-11-27 劢微机器人科技(深圳)有限公司 Tray pose positioning method, device and equipment and storage medium

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
WEIXIN_39634480: "Fitting multiple straight lines on an edge_ProSight | Line detection", CSDN *
武文汉: "Research on vision-based warehouse pallet detection methods", China Excellent Master's and Doctoral Dissertations Full-text Database (Master's), Information Science and Technology Series *

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2023272985A1 (en) * 2021-06-30 2023-01-05 兰剑智能科技股份有限公司 Smart stacker crane, and method, apparatus, and device for recognizing anomalous pallet position
CN114972505A (en) * 2022-04-29 2022-08-30 弥费实业(上海)有限公司 Position recognition system
CN114955342A (en) * 2022-06-02 2022-08-30 浙江中烟工业有限责任公司 Method and system for searching stagnant trays in elevated library
CN115546202A (en) * 2022-11-23 2022-12-30 青岛中德智能技术研究院 Tray detection and positioning method for unmanned forklift
CN115546202B (en) * 2022-11-23 2023-03-03 青岛中德智能技术研究院 Tray detection and positioning method for unmanned forklift
CN116040261A (en) * 2022-12-23 2023-05-02 青岛宝佳智能装备股份有限公司 Special tray turnover machine
CN116040261B (en) * 2022-12-23 2023-09-19 青岛宝佳智能装备股份有限公司 Special tray turnover machine
CN117381804A (en) * 2023-12-13 2024-01-12 珠海格力智能装备有限公司 Automatic material placement part aligning method and device for intelligent transfer robot
CN117381804B (en) * 2023-12-13 2024-04-02 珠海格力智能装备有限公司 Automatic material placement part aligning method and device for intelligent transfer robot

Also Published As

Publication number Publication date
WO2023272985A1 (en) 2023-01-05

Similar Documents

Publication Publication Date Title
CN113435524A (en) Intelligent stacker and method, device and equipment for identifying position abnormality of tray
KR102326097B1 (en) Pallet detection using units of physical length
AU2019247400B2 (en) Method, system, and apparatus for correcting translucency artifacts in data representing a support structure
US8934672B2 (en) Evaluating features in an image possibly corresponding to an intersection of a pallet stringer and a pallet board
US20210041564A1 (en) Position and posture estimation apparatus
US10430969B2 (en) Method for recognizing objects in a warehouse, and industrial truck with an apparatus for recognizing objects in a warehouse
US11336831B2 (en) Image processing device, control method, and program storage medium
WO2020039817A1 (en) Loading operation assistance device for forklift
KR20110027460A (en) A method for positioning and orienting of a pallet based on monocular vision
CN112630786A (en) AGV buffer area inventory method, device and equipment based on 2D laser
JP2013023320A (en) Automated warehouse
US20210276842A1 (en) Warehouse inspection system
JPWO2019163378A5 (en)
WO2021079790A1 (en) Operation assistance device for cargo handling vehicle
JP6690237B2 (en) Automatic warehouse
CN112193241A (en) Automatic parking method
CN110816522B (en) Vehicle attitude control method, apparatus, and computer-readable storage medium
WO2019240273A1 (en) Information processing device, unloading system provided with information processing device, and computer-readable storage medium
JP5217841B2 (en) Image processing apparatus and method for supporting parameter setting relating to defect detection on image
JP7069594B2 (en) Transport system
CN111932576B (en) Object boundary measuring method and device based on depth camera
RU2658092C2 (en) Method and navigation system of the mobile object using three-dimensional sensors
CN117644510A (en) Robot working method, device, equipment and storage medium
CN114590507A (en) Positioning system and positioning method
KR20240065416A (en) Teaching method, program stored in the medium for executing the teaching method and transfer system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20210924