CN115589845A - Intelligent cotton picking robot and cotton picking operation path planning method thereof - Google Patents


Info

Publication number
CN115589845A
Authority
CN
China
Prior art keywords
cotton
picking
layer
mature
feature extraction
Prior art date
Legal status
Pending
Application number
CN202211027649.3A
Other languages
Chinese (zh)
Inventor
苟先太
蒋晓凤
苟嘉轩
曾开心
顾凡
苟瀚文
Current Assignee
Sichuan Tianqi Intelligent Technology Co ltd
Southwest Jiaotong University
Original Assignee
Sichuan Tianqi Intelligent Technology Co ltd
Southwest Jiaotong University
Priority date
Filing date
Publication date
Application filed by Sichuan Tianqi Intelligent Technology Co ltd, Southwest Jiaotong University
Priority to CN202211027649.3A
Publication of CN115589845A


Classifications

    • A - HUMAN NECESSITIES
    • A01 - AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
    • A01D - HARVESTING; MOWING
    • A01D46/00 - Picking of fruits, vegetables, hops, or the like; Devices for shaking trees or shrubs
    • A01D46/08 - Picking of fruits, vegetables, hops, or the like; Devices for shaking trees or shrubs of cotton
    • A01D46/14 - Picking of fruits, vegetables, hops, or the like; Devices for shaking trees or shrubs of cotton using lint-from-plant pickers
    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B25 - HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J - MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J11/00 - Manipulators not otherwise provided for
    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B25 - HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J - MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J5/00 - Manipulators mounted on wheels or on carriages
    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B25 - HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J - MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00 - Programme-controlled manipulators
    • B25J9/16 - Programme controls
    • B25J9/1656 - Programme controls characterised by programming, planning systems for manipulators
    • B25J9/1661 - Programme controls characterised by programming, planning systems for manipulators characterised by task planning, object-oriented languages
    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B25 - HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J - MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00 - Programme-controlled manipulators
    • B25J9/16 - Programme controls
    • B25J9/1656 - Programme controls characterised by programming, planning systems for manipulators
    • B25J9/1664 - Programme controls characterised by programming, planning systems for manipulators characterised by motion, path, trajectory planning
    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B25 - HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J - MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00 - Programme-controlled manipulators
    • B25J9/16 - Programme controls
    • B25J9/1694 - Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion
    • B25J9/1697 - Vision controlled systems
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T1/00 - General purpose image data processing
    • G06T1/0014 - Image feed-back for automatic industrial control, e.g. robot with camera
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 - Arrangements for image or video recognition or understanding
    • G06V10/70 - Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/82 - Arrangements for image or video recognition or understanding using pattern recognition or machine learning using neural networks
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 - Scenes; Scene-specific elements
    • G06V20/60 - Type of objects

Landscapes

  • Engineering & Computer Science (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Evolutionary Computation (AREA)
  • Health & Medical Sciences (AREA)
  • Artificial Intelligence (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Computing Systems (AREA)
  • Databases & Information Systems (AREA)
  • General Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Software Systems (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Environmental Sciences (AREA)
  • Manipulator (AREA)

Abstract

The invention discloses an intelligent cotton picking robot and a cotton picking operation path planning method thereof. The robot comprises a mechanical arm, a mechanical claw, a camera, an ultrasonic module, a fork, wheels, a crawler belt, a cotton inlet, a rotary base and a machine body. The mechanical arm is mounted on top of the machine body via the rotary base and is movably connected to the mechanical claw; the camera, ultrasonic module and fork are mounted on the side wall of the machine body; the wheels, fitted with the crawler belt on their outer walls, are mounted at the bottom of the machine body; and the cotton inlet is arranged on top of the machine body. By mounting multiple picking arms on the picking platform and controlling them with an independent-yet-cooperative method, the robot greatly improves the efficiency of the picking operation, reduces the missed-picking rate, preserves the completeness and integrity of the picked cotton, and lowers its trash content, thereby ensuring stable cotton quality.

Description

Intelligent cotton picking robot and cotton picking operation path planning method thereof
Technical Field
The invention belongs to the technical field of cotton picking, and particularly relates to an intelligent cotton picking robot and a cotton picking operation path planning method thereof.
Background
China is a major cotton producer with a vast planting area. Cotton picking in China is still mainly manual; manual picking guarantees stable cotton quality, but suffers from high cost, low efficiency and heavy labor intensity. To automate and intelligentize cotton picking, cotton pickers have gradually been applied to cotton harvesting. Mechanical picking frees labor to a certain degree and reduces production cost, but suffers from a low picking rate and high impurity content. Moreover, because Chinese cotton is characterized by many varieties, short fibers and a long maturation period, large-scale mechanical picking yields harvests of uneven quality and is ill-suited to China's conditions. Developing a cotton picking robot suited to these conditions is therefore of great significance and has broad application prospects: such a robot not only reduces labor and cost but also improves picking efficiency and cotton quality.
At present, research on cotton picking robots has made active progress in cotton identification and positioning, autonomous navigation, and robot body structure design. Overall, however, efficiency remains low, and it is difficult to meet the practical demands of large-scale cotton picking. To improve working efficiency, the invention provides a multi-arm, autonomous, intelligent cotton picking robot with high efficiency, a high picking rate and clean picking: multiple picking arms are mounted on a picking platform and controlled by an independent-yet-cooperative method, greatly improving the efficiency of the picking operation while ensuring cotton quality.
Disclosure of Invention
The invention provides an intelligent cotton picking robot and a cotton picking operation path planning method thereof, aiming at solving the problem that the working efficiency of the existing single-arm cotton picking robot is generally low.
The technical scheme of the invention is as follows: an intelligent cotton picking robot comprises a mechanical arm, a mechanical claw, a camera, an ultrasonic module, a fork, wheels, a track, a cotton inlet, a rotary base and a machine body;
the mechanical arm is arranged at the top of the machine body through a rotating base; the mechanical arm is movably connected with the mechanical claw; the camera, the ultrasonic module and the fork are all arranged on the side wall of the machine body; the wheels are arranged at the bottom of the machine body; the outer wall of the wheel is provided with a crawler belt; the cotton inlet is arranged at the top of the machine body.
Furthermore, a navigation control module, a power module, a compression bin, a GPS module and a picking control module are arranged in the machine body;
the navigation control module is used for controlling the walking, steering and stopping of the machine body; the power module is used for providing power for the navigation control module; the compression bin is used for compressing cotton at the cotton inlet;
the GPS module is used for detecting the current position information of the robot and the environment information of the cotton field; the picking control module is used for controlling the mechanical arm and the mechanical claw.
The invention has the beneficial effects that: by mounting multiple picking arms on the picking platform and adopting an independent-yet-cooperative control method, the intelligent cotton picking robot greatly improves the efficiency of the picking operation, reduces the missed-picking rate, preserves the completeness and integrity of the picked cotton, and lowers its trash content, thereby ensuring stable cotton quality.
Based on the system, the invention also provides a cotton picking operation path planning method, which comprises the following steps:
s1: collecting RGB images of a cotton field by using a camera, and preprocessing the RGB images;
s2: establishing a mature cotton recognition neural network model, and recognizing mature cotton in the preprocessed RGB image by using the mature cotton recognition neural network model;
s3: determining the position of the mature cotton;
s4: and (4) performing operation planning on the cotton picking robot according to the position of the mature cotton.
Further, in step S1, the RGB image is preprocessed by sequentially applying median filtering and image sharpening.
Further, in the step S2, the mature cotton recognition neural network model includes an input layer, a dual-channel feature extraction layer, a feature fusion layer, a hidden layer and an output layer, which are connected in sequence;
the input layer is used for inputting the preprocessed RGB image into the dual-channel feature extraction layer; the dual-channel feature extraction layer is used for extracting feature vectors of the preprocessed RGB images; the feature fusion layer is used for fusing the feature vectors; the hidden layer is used for identifying the mature cotton according to the fused feature vector; the output layer is used for outputting the number of the identified mature cotton.
Further, the dual-channel feature extraction layer comprises a first feature extraction channel and a second feature extraction channel; the input image of the first feature extraction channel is the preprocessed RGB image; the input image of the second feature extraction channel is an image obtained by segmenting the preprocessed RGB image; the first and second feature extraction channels have the same feature extraction network structure, each comprising a first convolution module, a second convolution module and a full connection layer connected in sequence; the first and second convolution modules each comprise a first convolution layer, a first pooling layer, a second convolution layer and a second pooling layer connected in sequence;
in the dual-channel feature extraction layer, the first and second feature extraction channels extract feature vectors by the same method: the first convolution layer, first pooling layer, second convolution layer and second pooling layer sequentially perform two convolution operations and two pooling operations on the input image to obtain its feature matrix, and the full connection layer then converts the feature matrix into a one-dimensional feature vector;
in step S2, the segmentation function f(x) used to segment the preprocessed RGB image is:

f(x) = 1, if R - B ≥ T
f(x) = 0, if R - B < T

where R denotes the red component of the RGB components, B denotes the blue component of the RGB components, and T denotes a set threshold.
Further, the number n of neurons in the hidden layer is calculated as:

n = √(n_i + n_o) + a

where n_o denotes the number of neurons in the output layer, n_i denotes the number of neurons in the input layer, and a denotes an integer in [0, 10].
Further, in step S3, the position of the mature cotton in the space coordinate system is calculated by back-projecting the depth-camera measurement:

X = ((x - u) - γ(y - v)/a_y) · z/a_x
Y = (y - v) · z/a_y
Z = z

where X, Y and Z denote the abscissa, ordinate and vertical coordinate of the mature cotton in the space coordinate system; x, y and z denote the abscissa, ordinate and depth of the mature cotton in the depth camera coordinate system; a_x denotes the focal length of the camera on the x axis of the depth camera coordinate system; a_y denotes the focal length of the camera on the y axis of the depth camera coordinate system; γ denotes the non-perpendicularity (skew) factor of the camera; u denotes the pixel abscissa of the camera optical axis; and v denotes the pixel ordinate of the camera optical axis.
Further, step S4 comprises the following sub-steps:
s41: dividing the operation area of the cotton picking robot into a first picking area, a second picking area, a third picking area and a fourth picking area according to the position of the mature cotton;
s42: and (4) planning cotton picking sequences in all picking areas.
Further, in step S42, the specific method for planning the cotton picking sequence is: the set of all mature cotton is obtained, and the picking sequence is planned by the recurrence

d(i, V') = min_{k ∈ V'} { c_ik + d(k, V' - {k}) }
d(k, ∅) = c_ki

where d(i, V') denotes the length of the shortest path that starts at vertex i, passes through each vertex in the set V' of all mature cotton exactly once, and finally returns to vertex i; d(k, V' - {k}) denotes the length of the shortest path from vertex k through each vertex in V' - {k} exactly once and back to vertex i; d(k, ∅) = c_ki is the base case of returning directly from vertex k to vertex i; E denotes the set of distances; c_ik denotes the distance from vertex i to vertex k; and c_ki denotes the distance from vertex k to vertex i.
The invention has the beneficial effects that:
(1) The cotton picking operation path planning method provides a mature cotton automatic identification neural network model based on double-channel feature extraction, and improves the accuracy of identification and positioning of mature cotton;
(2) The cotton picking operation path planning method reasonably plans and schedules tasks of all mechanical arms according to the distribution condition of mature cotton; the intelligent autonomous navigation of the robot is realized based on machine vision, and path planning and real-time guidance in the moving process of the cotton picking robot are realized.
Drawings
FIG. 1 is a flow chart of an intelligent cotton picking robot;
FIG. 2 is a flow chart of a cotton picking operation path planning method;
FIG. 3 is a diagram of a neural network structure for identifying mature cotton based on dual channel feature extraction;
FIG. 4 is a schematic diagram of four work area divisions;
FIG. 5 is a schematic view of a task planning partition for four robotic arms;
in the figure, 1, a mechanical arm; 2. a gripper; 3. a camera; 4. an ultrasonic module; 5. a fork; 6. a wheel; 7. a crawler belt; 8. a navigation control module; 9. a power module; 10. compressing the bin; 11. a cotton inlet; 12. rotating the base; 13. a GPS module; 14. a picking control module; 15. a fuselage.
Detailed Description
The embodiments of the present invention will be further described with reference to the accompanying drawings.
As shown in fig. 1, the invention provides an intelligent cotton picking robot, which comprises a mechanical arm 1, a mechanical claw 2, a camera 3, an ultrasonic module 4, a fork 5, wheels 6, a crawler 7, a cotton inlet 11, a rotating base 12 and a machine body 15;
the mechanical arm 1 is arranged at the top of the machine body 15 through a rotating base 12; the mechanical arm 1 is movably connected with the mechanical claw 2; the camera 3, the ultrasonic module 4 and the fork 5 are all arranged on the side wall of the machine body 15; the wheels 6 are arranged at the bottom of the body 15; the outer wall of the wheel 6 is provided with a crawler belt 7; the cotton inlet 11 is arranged at the top of the machine body 15.
In the embodiment of the invention, the machine vision module collects scene images via depth cameras 3 mounted on the front, rear, left and right of the robot and identifies and positions mature cotton based on machine vision. The mechanical execution module consists of 13 picking mechanical arms 1: 4 each at the front, left and right of the robot, and 1 at the rear for cotton missed as the picking robot advances. Each arm carries a mechanical claw 2 as its end effector; the arms work cooperatively to guide the end effectors through the harvesting action and place the cotton through the cotton inlet 11 into the compression bin 10 of the cotton collecting device.
In the embodiment of the invention, as shown in fig. 1, a navigation control module 8, a power module 9, a compression bin 10, a GPS module 13 and a picking control module 14 are arranged in a machine body 15;
the navigation control module 8 is used for controlling the walking, steering and stopping of the machine body 15; the power module 9 is used for providing power for the navigation control module 8; the compression bin 10 is used for compressing cotton at the cotton inlet 11; the GPS module 13 is used for detecting the current position information of the robot and the environment information of the cotton field; the picking control module 14 is used to control the robot arm 1 and the gripper 2.
Based on the system, the invention also provides a cotton picking operation path planning method, which comprises the following steps as shown in figure 2:
s1: collecting RGB images of a cotton field by using a camera, and preprocessing the RGB images;
s2: establishing a mature cotton recognition neural network model, and recognizing mature cotton in the preprocessed RGB image by using the mature cotton recognition neural network model;
s3: determining the position of the mature cotton;
s4: and (4) performing operation planning on the cotton picking robot according to the position of the mature cotton.
In the embodiment of the present invention, in step S1, the RGB image is preprocessed by sequentially applying median filtering and image sharpening.
The median filtering is used for eliminating random noise possibly brought in during image acquisition, and protecting edge information of the image while filtering the noise. The purpose of image sharpening is to make the edge of the image steeper and clearer so as to highlight the edge and the outline of the image, and facilitate the subsequent extraction of the cotton outline feature.
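The two preprocessing operations can be sketched directly in NumPy; the window size and sharpening kernel below are illustrative assumptions, since the patent does not specify filter parameters:

```python
import numpy as np

def median_filter(img: np.ndarray, k: int = 3) -> np.ndarray:
    """Median-filter a 2-D image with a k x k window (edge-padded),
    removing impulse noise while preserving edges."""
    pad = k // 2
    padded = np.pad(img, pad, mode="edge")
    out = np.empty_like(img)
    h, w = img.shape
    for i in range(h):
        for j in range(w):
            out[i, j] = np.median(padded[i:i + k, j:j + k])
    return out

def sharpen(img: np.ndarray) -> np.ndarray:
    """Laplacian sharpening with the common 3 x 3 kernel; steepens
    edges to highlight the cotton outline."""
    kernel = np.array([[0, -1, 0], [-1, 5, -1], [0, -1, 0]], dtype=float)
    padded = np.pad(img.astype(float), 1, mode="edge")
    h, w = img.shape
    out = np.empty((h, w))
    for i in range(h):
        for j in range(w):
            out[i, j] = np.sum(padded[i:i + 3, j:j + 3] * kernel)
    return np.clip(out, 0, 255)
```

In practice each RGB channel (or the grayscale image) would be filtered before the sharpened result is fed to the recognition network.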
In the embodiment of the invention, as shown in fig. 3, in step S2, the mature cotton recognition neural network model includes an input layer, a dual-channel feature extraction layer, a feature fusion layer, a hidden layer and an output layer, which are connected in sequence;
the input layer is used for inputting the preprocessed RGB image to the dual-channel feature extraction layer; the dual-channel feature extraction layer is used for extracting feature vectors of the preprocessed RGB images; the feature fusion layer is used for fusing the feature vectors; the hidden layer is used for identifying the mature cotton according to the fused feature vector; the output layer is used for outputting the number of the identified mature cotton.
In the embodiment of the invention, the dual-channel feature extraction layer comprises a first feature extraction channel and a second feature extraction channel; the input image of the first feature extraction channel is a preprocessed RGB image; the input image of the second feature extraction channel is an image obtained by segmenting the preprocessed RGB image; the first feature extraction channel and the second feature extraction channel have the same feature extraction network structure and respectively comprise a first convolution module, a second convolution module and a full connection layer which are connected in sequence; the first convolution module and the second convolution module respectively comprise a first convolution layer, a first pooling layer, a second convolution layer and a second pooling layer which are connected in sequence; the convolutional layer kernel size is 3 × 3 and the pooling layer kernel size is 2 × 2.
In the dual-channel feature extraction layer, the first and second feature extraction channels extract feature vectors by the same method: the first convolution layer, first pooling layer, second convolution layer and second pooling layer sequentially perform two convolution operations and two pooling operations on the input image to obtain its feature matrix, and the full connection layer then converts the feature matrix into a one-dimensional feature vector;
and inputting the image into a double-channel feature extraction layer for multi-scale feature extraction so as to obtain richer feature information of the image. Two feature extraction channels of the two-channel feature extraction layer are respectively as follows: the first feature extraction channel based on the RGB image is mainly used for extracting complex and diverse features in the image; and the second feature extraction channel based on the R-B color difference information is mainly used for extracting the outline features of the cotton. The input images of the first characteristic extraction channel and the second characteristic extraction channel are different, and the characteristic extraction networks are the same.
In step S2, since the color of the cotton differs greatly from the background, color difference is an important basis for distinguishing cotton from backgrounds such as cotton leaves and branches. The cotton is segmented from the background based on the difference (R - B) between the R and B components of the RGB image to obtain its basic outline. The segmentation function f(x) applied to the preprocessed RGB image is:

f(x) = 1, if R - B ≥ T
f(x) = 0, if R - B < T

where R denotes the red component of the RGB components, B denotes the blue component, and T denotes a threshold determined by color statistical analysis.
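The R - B thresholding step can be sketched in a few lines of NumPy; the function name `segment_cotton` and the default threshold value are illustrative assumptions, since the patent leaves T to color statistical analysis:

```python
import numpy as np

def segment_cotton(rgb: np.ndarray, T: float = 40.0) -> np.ndarray:
    """Binary mask over an H x W x 3 RGB image: 1 where R - B >= T
    (candidate cotton), 0 elsewhere (leaves, branches, soil)."""
    r = rgb[..., 0].astype(float)
    b = rgb[..., 2].astype(float)
    return (r - b >= T).astype(np.uint8)
```

The mask image then serves as the input to the second feature extraction channel.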
In the embodiment of the present invention, a calculation formula of the number n of neurons in the hidden layer is:
n = √(n_i + n_o) + a

where n_o denotes the number of neurons in the output layer, n_i denotes the number of neurons in the input layer, and a denotes an integer in [0, 10].
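The empirical sizing rule above amounts to one line of arithmetic; a sketch (the rounding choice is an assumption, as the patent does not specify how the square root is rounded):

```python
import math

def hidden_neurons(n_in: int, n_out: int, a: int = 5) -> int:
    """Empirical rule n = sqrt(n_in + n_out) + a, with a an integer in [0, 10]."""
    assert 0 <= a <= 10
    return round(math.sqrt(n_in + n_out)) + a
```

For example, with 96 input neurons and 4 output neurons, the rule gives 10 + a hidden neurons.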
In the embodiment of the present invention, the feature vectors obtained by the first and second feature extraction channels are feature1 = (f_1, f_2, …, f_n) and feature2 = (g_1, g_2, …, g_n).

At the feature fusion layer, the feature vectors obtained by the two channels are fused so that the result contains more effective information:

F = (f_1, f_2, …, f_n, g_1, g_2, …, g_n).
in the embodiment of the present invention, in step S3, a calculation formula of the position of the mature cotton in the spatial coordinate system is:
X = ((x - u) - γ(y - v)/a_y) · z/a_x
Y = (y - v) · z/a_y
Z = z

where X, Y and Z denote the abscissa, ordinate and vertical coordinate of the mature cotton in the space coordinate system; x, y and z denote the abscissa, ordinate and depth of the mature cotton in the depth camera coordinate system; a_x denotes the focal length of the camera on the x axis of the depth camera coordinate system; a_y denotes the focal length of the camera on the y axis of the depth camera coordinate system; γ denotes the non-perpendicularity (skew) factor of the camera; u denotes the pixel abscissa of the camera optical axis; and v denotes the pixel ordinate of the camera optical axis.
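The positioning step is standard pinhole-camera back-projection from a pixel plus measured depth; a sketch, where `pixel_to_camera` and its argument names are illustrative rather than taken from the patent:

```python
def pixel_to_camera(x, y, depth, a_x, a_y, u, v, gamma=0.0):
    """Back-project pixel (x, y) with measured depth into depth-camera
    coordinates (X, Y, Z), using the pinhole model with skew gamma.
    a_x, a_y: focal lengths in pixels; (u, v): principal point."""
    Y = (y - v) * depth / a_y
    X = ((x - u) - gamma * (y - v) / a_y) * depth / a_x
    return X, Y, depth
```

With zero skew this reduces to the familiar X = (x - u)·z/a_x, Y = (y - v)·z/a_y.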
After the identification and positioning of the mature cotton are completed, the tasks of the mechanical arms must be reasonably planned and scheduled to realize multi-arm cooperative picking. Mechanically, the robot carries 13 mechanical arms with 5 degrees of freedom: 4 each at the front, left and right, and 1 at the rear dedicated to picking cotton missed as the robot advances.
In an embodiment of the present invention, step S4 includes the following sub-steps:
s41: dividing the operation area of the cotton picking robot into a first picking area, a second picking area, a third picking area and a fourth picking area according to the position of the mature cotton;
s42: and (4) planning the cotton picking sequence in each picking area.
In step S42, the specific method for planning the cotton picking sequence is: the set of all mature cotton is obtained, and the picking sequence is planned by the recurrence

d(i, V') = min_{k ∈ V'} { c_ik + d(k, V' - {k}) }
d(k, ∅) = c_ki

where d(i, V') denotes the length of the shortest path that starts at vertex i, passes through each vertex in the set V' of all mature cotton exactly once, and finally returns to vertex i; d(k, V' - {k}) denotes the length of the shortest path from vertex k through each vertex in V' - {k} exactly once and back to vertex i; d(k, ∅) = c_ki is the base case of returning directly from vertex k to vertex i; E denotes the set of distances; c_ik denotes the distance from vertex i to vertex k; and c_ki denotes the distance from vertex k to vertex i.
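This recurrence is the classic Held-Karp dynamic program for the travelling-salesman problem. A minimal sketch, where the function name `picking_order` and the bitmask state encoding are illustrative assumptions (the exponential memory cost limits it to small boll counts per picking area):

```python
from functools import lru_cache

def picking_order(dist):
    """Held-Karp: length of the shortest tour that starts at boll 0,
    visits every other boll exactly once, and returns to boll 0.
    dist is a full n x n distance matrix."""
    n = len(dist)
    FULL = (1 << n) - 1  # bitmask with all bolls marked visited

    @lru_cache(maxsize=None)
    def d(k, visited):
        # shortest path from boll k through all unvisited bolls back to 0
        if visited == FULL:
            return dist[k][0]          # base case: return directly to start
        return min(dist[k][j] + d(j, visited | (1 << j))
                   for j in range(n) if not visited & (1 << j))

    return d(0, 1)  # start at boll 0, which is already marked visited
```

For four bolls on a unit square the optimal tour length is 4, matching the perimeter.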
The operation task planning of the multi-arm collaborative cotton picking robot is as follows:
(1) Four relatively independent operation areas are divided according to the distribution condition of cotton, and different mechanical arms are distributed to operate, so that the operation ranges of the mechanical arms are not crossed, and the problem of conflict in the operation process of the different mechanical arms is effectively avoided.
(2) The planning of the picking sequence of the cotton in the exclusive picking area by each mechanical arm is realized, and the end effector is driven according to the picking sequence to finish the harvesting action until all the mature cotton in the harvesting area is picked.
(3) As the picking scale, the number of mechanical arms and the operating range grow, even non-overlapping working areas may leave regions that no arm can reach, causing missed picking; the mechanical arm at the rear of the cotton picking robot is dedicated to picking this missed cotton.
The four relatively independent operation areas are divided according to the distribution of the cotton so that the amount of cotton in each mechanical arm's operation area is relatively balanced. The method is as follows:
assuming that the total amount of mature cotton in the image is N, the central coordinate c (x) of the image is obtained 0 ,y 0 ). Three-dimensional distribution information of mature cotton is obtained in the cotton picking system, and the number of the mature cotton in the upper half part is counted through the target coordinate position, wherein the number is assumed to be n. The formula is as follows:
Figure BDA0003816391050000073
wherein a is a constant, and an appropriate value is required according to actual conditions. Constantly updating y by the above formula 0 Until N =2/N, the number of ripe cotton in the left half area is counted and x is counted similarly 0 Is updated to finally obtain the number center coordinate c' (x) 0 ,y 0 )。
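The balancing iteration for one axis can be sketched as follows; the function name `balance_split`, the centroid initialization, and the stopping tolerance are assumptions filling in details the patent leaves to "actual conditions":

```python
def balance_split(points, a=1.0, max_iter=1000):
    """Shift the horizontal split line y0 by step a until the upper half
    (y < y0, image coordinates) holds about half of the mature bolls.
    points: list of (x, y) boll coordinates; returns the balanced y0."""
    n_total = len(points)
    y0 = sum(y for _, y in points) / n_total      # start at the centroid
    for _ in range(max_iter):
        n_upper = sum(1 for _, y in points if y < y0)
        if abs(n_upper - n_total / 2) < 1:        # balanced to within one boll
            break
        y0 += a if n_upper < n_total / 2 else -a  # step toward the denser half
    return y0
```

Running the same procedure on the x coordinates gives the second split line, producing the four quantity-balanced picking areas.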
Assuming that there are 40 mature cotton bolls in the operation area, the division into four operation areas is shown in fig. 4.
After the four operation areas are divided, each mechanical arm is allocated a fixed operation area in which to carry out the picking operation. The task planning for the four mechanical arms is shown in fig. 5, wherein TL is the operation area of the first mechanical arm, TR that of the second, BL that of the third, and BR that of the fourth.
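The assignment of a mature cotton target to the TL/TR/BL/BR areas relative to the quantity center can be sketched as follows. This is a minimal illustration; the image-coordinate convention (y increasing downward, so y < y0 is the top half) is an assumption.

```python
def assign_region(x, y, x0, y0):
    """Map a cotton coordinate to its arm's area relative to c'(x0, y0).

    Assumed image convention: y grows downward, so y < y0 is the top half.
    TL/TR/BL/BR correspond to the first through fourth mechanical arms.
    """
    if y < y0:
        return "TL" if x < x0 else "TR"
    return "BL" if x < x0 else "BR"
```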
The autonomous navigation mobile system consists of a robot mobile platform, a machine vision module, a GPS module and an ultrasonic obstacle avoidance module, and comprises the camera 3, the ultrasonic module 4, the wheels 6, the track 7, the power module 9 and the GPS module 13. The robot mobile platform adopts a crawler-type traveling mechanism (wheels 6, crawler 7 and power module 9) that can adapt to a variety of complex road surfaces and moves the cotton picking robot across the cotton field. The machine vision module acquires scene images and environmental information of the cotton field in real time through the camera 3 mounted at the front of the cotton picking robot. The GPS module 13 detects the current position of the robot and the environmental information of the cotton-field road surface. The ultrasonic obstacle avoidance module 4 performs obstacle avoidance using ultrasonic ranging sensors mounted on the front, back, left and right of the robot. Together these modules complete path planning and real-time guidance while the cotton picking robot moves, realizing autonomous intelligent navigation of the robot.
The robot's autonomous navigation is implemented as follows: a depth camera mounted at the front of the cotton picking robot serves as the sensing element and acquires an image of the surrounding environment; after binarization, filtering and other processing, boundary information is extracted using techniques such as the Hough transform. Several sensors (such as the GPS sensor and the ultrasonic ranging sensors) acquire environment information; a multi-sensor information fusion method combined with pattern recognition determines the direction of any obstacle, and the robot's walking route is planned accordingly. The resulting information is transmitted to the controller of the navigation control module 8, which, through path planning optimization, controls the power modules 9 on the left and right sides of the crawler-type moving mechanism and commands the robot to move autonomously to the preset position without human interference.
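As a rough illustration of the boundary-extraction step, a minimal Hough-transform accumulator over a binary edge image can be written as follows. This is a NumPy sketch under our own parameterization, not the patent's implementation; a real system would more likely call cv2.HoughLines on the filtered image.

```python
import numpy as np

def hough_accumulator(binary, n_theta=180, n_rho=200):
    """Accumulate votes for lines rho = x*cos(theta) + y*sin(theta).

    binary: 2-D array of 0/1 edge pixels. Returns (acc, rhos, thetas);
    the argmax of acc gives the dominant boundary line.
    """
    h, w = binary.shape
    diag = int(np.ceil(np.hypot(h, w)))              # max possible |rho|
    thetas = np.linspace(0.0, np.pi, n_theta, endpoint=False)
    rhos = np.linspace(-diag, diag, n_rho)
    acc = np.zeros((n_rho, n_theta), dtype=np.int32)
    ys, xs = np.nonzero(binary)
    cos_t, sin_t = np.cos(thetas), np.sin(thetas)
    for x, y in zip(xs, ys):
        r = x * cos_t + y * sin_t                    # rho for every theta
        idx = np.round((r + diag) / (2 * diag) * (n_rho - 1)).astype(int)
        acc[idx, np.arange(n_theta)] += 1
    return acc, rhos, thetas
```

A perfectly vertical crop-row boundary at x = 10, for instance, produces a sharp peak at theta = 0 and rho close to 10.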
The control system consists of the picking control module 14 and the navigation control module 8. The picking control module 14 performs the independent and cooperative control of the multiple mechanical arms 1 and the control of the mechanical claw 2; the navigation control module 8 mainly controls the walking, steering and stopping of the cotton picking robot. When the cotton picking system detects mature cotton, the controller of the picking control module 14 drives the mechanical arms 1 and the end-effector mechanical claws 2 to complete the cotton picking operation and place the cotton into the cotton collection device (cotton inlet 11 and compression bin 10). The controller of the navigation control module 8 receives information transmitted by the controller of the picking control module 14: when the cotton picking system detects target information of mature cotton, the navigation control module 8 receives "walking" instruction information from the picking control module 14; when mature cotton is not detected, the navigation control module 8 receives "stop" instruction information from the picking control module 14.
The invention has the beneficial effects that:
(1) The intelligent cotton picking robot is provided with a plurality of picking mechanical arms on the picking platform and adopts an independent cooperative control method, which greatly improves the efficiency of the picking operation, reduces the missed-picking rate, keeps the picked cotton intact, lowers its impurity rate and ensures stable cotton quality;
(2) The cotton picking operation path planning method provides a mature-cotton automatic identification neural network model based on dual-channel feature extraction, improving the accuracy of mature-cotton identification and positioning;
(3) The cotton picking operation path planning method reasonably plans and schedules the task of each mechanical arm according to the distribution of the mature cotton; intelligent autonomous navigation of the robot is realized based on machine vision, providing path planning and real-time guidance while the cotton picking robot moves.
It will be appreciated by those of ordinary skill in the art that the embodiments described herein are intended to help the reader understand the principles of the invention, and that the scope of the invention is not limited to these specifically recited embodiments and examples. Those skilled in the art can make various other specific changes and combinations based on the teachings of the present invention without departing from its spirit, and these changes and combinations remain within the scope of the invention.

Claims (10)

1. An intelligent cotton picking robot is characterized by comprising a mechanical arm (1), a mechanical claw (2), a camera (3), an ultrasonic module (4), a fork (5), wheels (6), a track (7), a cotton inlet (11), a rotating base (12) and a machine body (15);
the mechanical arm (1) is arranged at the top of the machine body (15) through a rotating base (12); the mechanical arm (1) is movably connected with the mechanical claw (2); the camera (3), the ultrasonic module (4) and the fork (5) are all arranged on the side wall of the machine body (15); the wheels (6) are arranged at the bottom of the machine body (15); the outer wall of the wheel (6) is provided with a crawler belt (7); the cotton inlet (11) is arranged at the top of the machine body (15).
2. The intelligent cotton picking robot according to claim 1, characterized in that a navigation control module (8), a power module (9), a compression bin (10), a GPS module (13) and a picking control module (14) are arranged in the machine body (15);
the navigation control module (8) is used for controlling the walking, steering and stopping of the machine body (15); the power module (9) is used for providing power for the navigation control module (8); the compression bin (10) is used for compressing cotton at the cotton inlet (11); the GPS module (13) is used for detecting the current position information of the robot and the environment information of the cotton field; and the picking control module (14) is used for controlling the mechanical arm (1) and the mechanical claw (2).
3. A cotton picking operation path planning method is characterized by comprising the following steps:
s1: collecting RGB images of a cotton field by using a camera, and preprocessing the RGB images;
s2: constructing a mature cotton recognition neural network model, and recognizing mature cotton in the preprocessed RGB image by using the mature cotton recognition neural network model;
s3: determining the position of the mature cotton;
s4: and performing operation planning on the cotton picking robot according to the position of the mature cotton.
4. The cotton picking operation path planning method according to claim 3, wherein in the step S1, the specific method for preprocessing the RGB image is as follows: and sequentially carrying out median filtering and image sharpening on the RGB image.
5. The cotton picking operation path planning method according to claim 3, wherein in the step S2, the mature cotton recognition neural network model comprises an input layer, a dual-channel feature extraction layer, a feature fusion layer, a hidden layer and an output layer which are connected in sequence;
the input layer is used for inputting the preprocessed RGB image into the dual-channel feature extraction layer; the dual-channel feature extraction layer is used for extracting feature vectors of the preprocessed RGB images; the feature fusion layer is used for fusing the feature vectors; the hidden layer is used for identifying the mature cotton according to the fused feature vector; the output layer is used for outputting the number of the identified mature cotton.
6. The cotton picking operation path planning method according to claim 5, wherein the two-channel feature extraction layer comprises a first feature extraction channel and a second feature extraction channel; the input image of the first feature extraction channel is a preprocessed RGB image; the input image of the second feature extraction channel is an image obtained by segmenting a preprocessed RGB image; the first feature extraction channel and the second feature extraction channel have the same feature extraction network structure and respectively comprise a first convolution module, a second convolution module and a full connection layer which are connected in sequence; the first convolution module and the second convolution module respectively comprise a first convolution layer, a first pooling layer, a second convolution layer and a second pooling layer which are connected in sequence;
in the dual-channel feature extraction layer, the method for extracting the feature vector by using the first feature extraction channel and the second feature extraction channel is the same, and the method is as follows: performing convolution operation twice and pooling operation twice on the input image by using the first convolution layer, the first pooling layer, the second convolution layer and the second pooling layer to obtain a characteristic matrix of the input image; converting the characteristic matrix of the input image into a one-dimensional characteristic vector by using a full connection layer;
in step S2, an expression of a segmentation function f (x) for segmenting the preprocessed RGB image is:
f(x) = 1, if R − B > T
f(x) = 0, otherwise

where R denotes the red component of the RGB components, B denotes the blue component of the RGB components, and T denotes a set threshold.
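A sketch of such a segmentation in Python follows. Note that the exact form of f(x) is not fully legible in this text, so the R − B > T condition below, and the default threshold value, are assumptions made for illustration only.

```python
import numpy as np

def segment(rgb, T=40):
    """Binary segmentation of an RGB image of shape (H, W, 3).

    Assumed rule: a pixel is marked 1 when R - B exceeds the threshold T,
    and 0 otherwise. The component symbols R, B and T follow the claim.
    """
    R = rgb[..., 0].astype(np.int32)   # widen to avoid uint8 underflow
    B = rgb[..., 2].astype(np.int32)
    return (R - B > T).astype(np.uint8)
```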
7. The cotton picking operation path planning method according to claim 5, wherein the calculation formula of the number n of the neurons of the hidden layer is as follows:
n = sqrt(n_i + n_0) + a

wherein n_0 indicates the number of neurons in the output layer, n_i represents the number of neurons in the input layer, and a represents an integer in [0, 10].
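Assuming the classical empirical sizing rule n = sqrt(n_i + n_0) + a (a common reading of this claim), the hidden-layer width can be computed as below; the default a = 4 is an arbitrary choice for illustration.

```python
import math

def hidden_neurons(n_i, n_0, a=4):
    """Empirical hidden-layer size: n = sqrt(n_i + n_0) + a.

    n_i: input-layer neurons, n_0: output-layer neurons,
    a: an integer in [0, 10] tuned to the actual problem.
    The square root is rounded to the nearest integer (an assumption;
    the claim does not state the rounding rule).
    """
    if not 0 <= a <= 10:
        raise ValueError("a must be an integer in [0, 10]")
    return round(math.sqrt(n_i + n_0)) + a
```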
8. The method for planning a cotton picking operation path according to claim 3, wherein in the step S3, the calculation formula of the position of the mature cotton in the space coordinate system is as follows:
z · [x, y, 1]^T = K · [X, Y, Z]^T,  where  K = [[a_x, γ, u], [0, a_y, v], [0, 0, 1]]

wherein X represents the abscissa of the mature cotton in the space coordinate system, Y represents the ordinate of the mature cotton in the space coordinate system, Z represents the vertical coordinate of the mature cotton in the space coordinate system; x represents the abscissa of the mature cotton in the depth camera coordinate system, y represents the ordinate of the mature cotton in the depth camera coordinate system, z represents the vertical coordinate of the mature cotton in the depth camera coordinate system; a_x represents the focal length of the camera along the x-axis of the depth camera coordinate system, a_y represents the focal length of the camera along the y-axis of the depth camera coordinate system, γ represents the non-perpendicularity (skew) factor of the camera, u represents the horizontal coordinate of the camera optical-axis pixel, and v represents the vertical coordinate of the camera optical-axis pixel.
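Assuming the standard pinhole model behind these symbols, the back-projection from a pixel (x, y) with measured depth z to the 3-D position (X, Y, Z) can be sketched as follows. This is a hypothetical illustration: the intrinsic-matrix layout built from a_x, a_y, γ, u, v is the usual one, and the function name is ours.

```python
import numpy as np

def pixel_to_point(x, y, z, a_x, a_y, gamma, u, v):
    """Invert z * [x, y, 1]^T = K [X, Y, Z]^T for pinhole intrinsics
    K = [[a_x, gamma, u], [0, a_y, v], [0, 0, 1]]."""
    K = np.array([[a_x, gamma, u],
                  [0.0, a_y,   v],
                  [0.0, 0.0, 1.0]])
    X, Y, Z = z * (np.linalg.inv(K) @ np.array([x, y, 1.0]))
    return X, Y, Z
```

For a camera with focal lengths 500, principal point (320, 240) and no skew, the principal-point pixel at depth 2 m maps to (0, 0, 2), as expected.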
9. A cotton picking operation path planning method according to claim 3, characterized in that the step S4 comprises the following substeps:
s41: dividing the operation area of the cotton picking robot into a first picking area, a second picking area, a third picking area and a fourth picking area according to the position of the mature cotton;
s42: and (4) planning cotton picking sequences in all picking areas.
10. The cotton picking operation path planning method according to claim 9, wherein in the step S42, the specific method for planning the cotton picking sequence is as follows: the set of all mature cotton is obtained, and the calculation formula for cotton picking sequence planning is:
d(i, V′) = min over k ∈ V′ of { c_ik + d(k, V′ − {k}) }
d(k, ∅) = c_ki

where d(i, V′) represents the length of the shortest path that starts from vertex i, passes through each vertex in the set V′ of all mature cotton once and only once, and finally returns to vertex i; d(k, V′ − {k}) represents the corresponding shortest path length starting from vertex k; d(k, ∅) = c_ki represents the length of the path from vertex k directly back to vertex i; E denotes the set of distances; c_ik denotes the distance from vertex i to vertex k; and c_ki denotes the distance from vertex k to vertex i.
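The recurrence in claim 10 is the Held-Karp dynamic program for the travelling-salesman problem. A compact sketch follows; the bitmask formulation, with the tour starting and ending at vertex 0, is our own and is shown only to illustrate the recurrence.

```python
def held_karp(dist):
    """Shortest closed tour visiting every vertex exactly once.

    dist: n x n distance matrix. dp[mask][k] is the cheapest path that
    starts at vertex 0, visits exactly the vertices in mask, and ends
    at k -- the d(i, V') of the claim with i = 0.
    """
    n = len(dist)
    INF = float("inf")
    dp = [[INF] * n for _ in range(1 << n)]
    dp[1][0] = 0.0                        # only vertex 0 visited so far
    for mask in range(1 << n):
        if not mask & 1:
            continue                      # every tour contains vertex 0
        for k in range(n):
            if dp[mask][k] == INF:
                continue
            for j in range(n):
                if mask >> j & 1:
                    continue              # j already visited
                nm = mask | (1 << j)
                cost = dp[mask][k] + dist[k][j]
                if cost < dp[nm][j]:
                    dp[nm][j] = cost
    full = (1 << n) - 1
    # close the tour: d(k, empty set) = c_k0 in the claim's notation
    return min(dp[full][k] + dist[k][0] for k in range(1, n))
```

On four cotton targets at the corners of a unit square, the optimal picking tour simply walks the perimeter, with length 4.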
CN202211027649.3A 2022-08-25 2022-08-25 Intelligent cotton picking robot and cotton picking operation path planning method thereof Pending CN115589845A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211027649.3A CN115589845A (en) 2022-08-25 2022-08-25 Intelligent cotton picking robot and cotton picking operation path planning method thereof

Publications (1)

Publication Number Publication Date
CN115589845A true CN115589845A (en) 2023-01-13

Family

ID=84842320

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211027649.3A Pending CN115589845A (en) 2022-08-25 2022-08-25 Intelligent cotton picking robot and cotton picking operation path planning method thereof

Country Status (1)

Country Link
CN (1) CN115589845A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117021059A (en) * 2023-10-09 2023-11-10 北京市农林科学院智能装备技术研究中心 Picking robot, fruit positioning method and device thereof, electronic equipment and medium
CN117021059B (en) * 2023-10-09 2024-02-06 北京市农林科学院智能装备技术研究中心 Picking robot, fruit positioning method and device thereof, electronic equipment and medium

Similar Documents

Publication Publication Date Title
CN111602517B (en) Distributed visual active perception method for string-type fruits and application of distributed visual active perception method
US11753039B2 (en) Collaborative autonomous ground vehicle
US11406061B2 (en) Automated walnut picking and collecting method based on multi-sensor fusion technology
CN111496770B (en) Intelligent carrying mechanical arm system based on 3D vision and deep learning and use method
CN110122073B (en) Strawberry picking robot based on machine vision
CN102368158B (en) Navigation positioning method of orchard machine
CN108811766B (en) Man-machine interactive greenhouse fruit and vegetable harvesting robot system and harvesting method thereof
CN109699300B (en) Control system and control method of intelligent citrus picking machine
CN203775715U (en) Fruit picking device
CN106863259A (en) A kind of wheeled multi-robot intelligent ball collecting robot
CN115299245B (en) Control method and control system of intelligent fruit picking robot
CN109197160A (en) A kind of guava picking robot and its implementation
CN113207675B (en) Airflow vibration type facility crop automatic pollination device and method
CN115589845A (en) Intelligent cotton picking robot and cotton picking operation path planning method thereof
CN114080905A (en) Picking method based on digital twins and cloud picking robot system
Li et al. A multi-arm robot system for efficient apple harvesting: Perception, task plan and control
CN116494201A (en) Monitoring integrated power machine room inspection robot and unmanned inspection method
CN113812262A (en) Tea-oil camellia fruit picking robot based on machine vision
CN117546681A (en) Collaborative work method, system and platform for guiding robot to pick and transport fruits
CN116257081A (en) Unmanned aerial vehicle and ground unmanned aerial vehicle cooperation emasculation operation system
CN116576859A (en) Path navigation method, operation control method and related device
CN115139315A (en) Grabbing motion planning method for picking mechanical arm
Chandra et al. Garbage detection and path-planning in autonomous robots
LU502977B1 (en) Automated walnut picking and collecting method based on multi-sensor fusion technology
CN216982615U (en) Tomato picking robot

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination