CN112215139A - Cutting knife positioning algorithm based on combination of ultrasonic sensor and monocular camera


Info

Publication number: CN112215139A
Application number: CN202011082122.1A
Authority: CN (China)
Original language: Chinese (zh)
Prior art keywords: monocular camera, ultrasonic, cutting knife, module, hoop
Inventors: 徐洛冬, 张云飞
Original and current assignee: Northeastern University (China)
Priority: CN202011082122.1A
Legal status: Pending

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/10Terrestrial scenes
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J11/00Manipulators not otherwise provided for
    • B25J11/005Manipulators for mechanical processing tasks
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J13/00Controls for manipulators
    • B25J13/08Controls for manipulators by means of sensing devices, e.g. viewing or touching devices
    • B25J13/088Controls for manipulators by means of sensing devices, e.g. viewing or touching devices with position, velocity or acceleration sensors
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J5/00Manipulators mounted on wheels or on carriages
    • B25J5/005Manipulators mounted on wheels or on carriages mounted on endless tracks or belts
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • B25J9/1602Programme controls characterised by the control system, structure, architecture
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • B25J9/1679Programme controls characterised by the tasks executed
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S15/00Systems using the reflection or reradiation of acoustic waves, e.g. sonar systems
    • G01S15/02Systems using the reflection or reradiation of acoustic waves, e.g. sonar systems using reflection of acoustic waves
    • G01S15/06Systems determining the position data of a target
    • G01S15/08Systems for measuring distance only
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/40Extraction of image or video features
    • G06V10/56Extraction of image or video features relating to colour

Landscapes

  • Engineering & Computer Science (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Acoustics & Sound (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Human Computer Interaction (AREA)
  • Automation & Control Theory (AREA)
  • Manipulator (AREA)

Abstract

The invention discloses a cutting-knife positioning algorithm based on the combination of an ultrasonic sensor array and a monocular camera. By fusing the data of three ultrasonic ranging modules, the algorithm quickly and accurately computes the position of the target rubber tree relative to the sensors, and the monocular camera then further improves the control precision, so that the robot's cutting knife moves smoothly to a suitable tapping position and the subsequent tapping task can proceed reliably. The invention improves the efficiency and quality of the whole tapping task while effectively reducing labor and hardware costs; it is cheap and algorithmically simple, greatly reducing the development cost and complexity of existing positioning algorithms and improving the final economic benefit.

Description

Cutting knife positioning algorithm based on combination of ultrasonic sensor and monocular camera
Technical Field
The invention relates to the key technical field of tapping robots, in particular to an intelligent crawler-type automatic tapping robot, and specifically to the design and implementation of a positioning algorithm for the cutting knife on the mechanical arm of an intelligent tapping robot when cutting a rubber tree. To move the tapping robot's cutting knife to the initial tapping position more quickly and accurately, this design proposes a cutting-knife positioning algorithm based on the combination of an ultrasonic sensor and a monocular camera, so that the knife can be positioned quickly and effectively and the tapping yield improved.
Background
At present, natural rubber is still harvested by the original means of manual tapping: a worker uses a special tapping knife to cut a semicircular oblique incision around the trunk at a specific height, and the latex is collected after it flows down along the incision into a cup at the bottom. Roughly 25,000 workers are currently engaged in tapping at Hainan Rubber. Latex flows best at a temperature of about 25 degrees Celsius, so tapping is usually done between 1 a.m. and 6 a.m., with a workload of about 30 trees per worker per day. The labor cost of tapping is therefore high, there is still considerable room to improve working efficiency, and the safety of workers operating before dawn under harsh conditions is difficult to guarantee.
The invention patent "Tapping robot system and tapping method" (CN110122256A), published by China Agricultural University on August 16, 2019, uses a depth camera to measure the distance between the tapping knife and the rubber tree and the cutting depth during tapping. Analysis shows that recognizing tapping cuts on a rubber tree with a depth camera is particularly difficult at night, and the approach suffers from high cost, many algorithm parameters, and frequent retuning. Moreover, the AUBO-i5 six-degree-of-freedom industrial mechanical arm it uses is extremely expensive at roughly 84,000 RMB; although its stability is guaranteed, mass production at that price is difficult.
The utility-model patent "An intelligent tapping robot" (CN210998744U), published by Shenyang Siasun Robot and Automation Co., Ltd. on July 14, 2020, positions the cutting knife using a camera pan-tilt unit (model DH-PTZ11204-GN-P, Zhejiang Dahua Technology Co., Ltd.). Analysis shows that this camera has good night-vision capability and a moderate price among similar products, and works well in combination with a recognition algorithm. In both algorithm and cost, however, it falls short of the multi-ultrasonic modules and monocular camera used in the present invention.
The invention patent "A tapping robot" (CN110696008A), published by Beijing Information Science and Technology University on January 17, 2020, also involves ultrasonic ranging, but its ultrasonic sensor is used only to gauge the penetration depth of the tapping knife. In short, the ultrasonic data serve merely as a judgment flag, the resulting control action is simple, and this part contains no innovation.
In short, the finished tapping robots currently on the market still have many problems: finished industrial mechanical arms are expensive and require large investment; commercial robots are not very extensible; cutting-knife positioning is inaccurate when the robot works at night; and positioning adapts poorly to trunks that grow in different postures. Although industrial arms are now highly integrated and much more flexible, their extensibility is tightly constrained by their developers; for example, the sensors and functional modules needed to assist tapping are difficult to integrate into an existing arm. These problems increase the difficulty the robot faces in identifying the cutting-knife position, which is the problem urgently to be solved. As is well known, quickly and effectively speeding up the work and completing the task is the most effective means of improving the economic benefit of the tapping industry.
Disclosure of Invention
The purposes of the invention are to reduce the uncertainty introduced by manual operation, to improve the positioning precision of the robot's cutting knife during tapping, and to enhance its working stability; and, at the same time, to reduce the manufacturing cost of the robot and increase its expandability, so that it adapts better to the varied tapping environments met in actual work, improving working efficiency and, ultimately, the final tapping yield.
The technical scheme adopted by the invention to solve the technical problem is a cutting-knife positioning algorithm based on the combination of an ultrasonic sensor and a monocular camera. The system comprises an ultrasonic module, a monocular camera module and an auxiliary identification mark on the target rubber tree. The algorithm performs difference-over-sum fusion on the data of several ultrasonic modules at different positions on the same horizontal plane, computes the relative position deviation between the hoop support device and the target rubber tree, and then makes a preliminary adjustment of the mechanical arm according to this deviation to complete the coarse-positioning stage. The accompanying monocular camera then precisely matches the cross auxiliary positioning mark on the target trunk, and the arm is finely adjusted according to the matching information to achieve the final accurate positioning. The specific control steps are as follows.
The coarse-adjustment procedure is as follows.
Step 1. Check whether the battery voltage is too low; if so, report that the battery charge is insufficient for operation, otherwise initialize the sensors and parameters.
Step 2. Run the ROS main program and start each node manager.
Step 3. Listen to the ultrasonic topics and read the data currently returned by the three ultrasonic sensors. If no data are available, keep listening and repeat step 3; otherwise go to step 4.
Step 4. Execute the ultrasonic callback function, then go to step 5.
Step 5. Compute the position deviation of the target rubber tree relative to the three ultrasonic sensors with the difference-over-sum algorithm.
Step 6. Judge whether the target rubber tree lies directly in front of the hoop device, i.e. whether the computed ultrasonic position value lies in the range [-0.1, 0.1]. If so, go to step 8; otherwise go to step 7.
Step 7. Adjust the arm position via PID according to the computed deviation, then repeat from step 3.
Step 8. Set the coarse-adjustment completion flag.
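The coarse-adjustment loop of steps 3 to 8 can be sketched in Python. This is a minimal illustration, not the patented implementation: `read_ultrasonics`, `position_value` and `move_arm` are hypothetical stand-ins for the ROS topic callback, the difference-over-sum calculation and the PID adjustment, and the proportional gain is an assumed value; only the [-0.1, 0.1] dead band comes from this description.

```python
def coarse_adjust(read_ultrasonics, position_value, move_arm,
                  dead_band=0.1, gain=5.0, max_iters=50):
    """Coarse positioning: read the three ultrasonic ranges, fuse them
    into a single position value, and nudge the arm until the value
    falls inside [-dead_band, dead_band] (steps 3-8)."""
    for _ in range(max_iters):
        l, m, r = read_ultrasonics()          # step 3: latest topic data
        value = position_value(l, m, r)       # step 5: fused deviation
        if -dead_band <= value <= dead_band:  # step 6: tree dead ahead?
            return True                       # step 8: set coarse flag
        move_arm(-gain * value)               # step 7: P-only correction
    return False                              # gave up; stay in coarse stage
```

In the real system the loop would be driven by ROS callbacks rather than polling, and step 7 would use a full PID controller rather than the proportional term alone.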
The fine-adjustment procedure is as follows.
Step 1. Check whether the battery voltage is too low; if so, report that the battery charge is insufficient for operation, otherwise initialize the sensors and parameters, set the color threshold of the cross auxiliary identification mark, and turn on the lighting device.
Step 2. Run the ROS main program, start each node manager, and begin publishing topics and listening for topic data.
Step 3. Capture a monocular-camera image.
Step 4. Judge whether coarse positioning has completed; if so, go to step 5, otherwise go back to step 3.
Step 5. Execute the monocular-camera callback and draw, at the center of the raw frame acquired by the sensor, a shape the same size as the cross auxiliary pattern, yielding preprocessed image I.
Step 6. Convert the raw frame to the HSV color model, gray-scale it according to the color threshold parameters so that objects of other colors are masked out, yielding image II.
Step 7. Apply convolution filtering to image II, binarize it, and store the result as image III.
Step 8. Detect the contour of the white region in image III, draw it on the original frame, and record the center point of the detected mark.
Step 9. Compare the cross-mark region of preprocessed image I from step 5 with that of the frame processed in step 8, compute the coordinate deviation between the two mark centers, and drive the mechanical arm forward/backward and left/right with a PID controller so that the centers of the two cross marks coincide as closely as possible and the area of the cross mark detected in step 8 matches the preset cross auxiliary mark as closely as possible.
Step 10. Judge whether the current camera-image mark is aligned with the mark on the target trunk; if so, go to step 11, otherwise go back to step 3.
Step 11. Set the fine-adjustment completion flag.
Step 12. End the program.
Through these two stages of coarse and fine adjustment, the robot can complete the cutting-knife positioning quickly and accurately.
The invention adopts the combination of ultrasonic ranging modules and a monocular camera: through multi-ultrasonic data fusion, the position of the target rubber tree relative to the three ultrasonic sensors can be computed accurately and quickly, and the monocular camera then further improves the control precision, so that the robot's cutting knife moves smoothly to a suitable tapping position, providing a solid guarantee that the subsequent tapping task proceeds smoothly. The invention not only improves the efficiency and quality of the whole tapping task but also effectively reduces labor and hardware costs; it is cheap and algorithmically simple, greatly reducing the development cost of existing positioning algorithms and improving the final economic benefit.
Drawings
The various objects, features, aspects and advantages of the present invention will become more apparent from the following detailed description of the preferred embodiments when taken in conjunction with the accompanying drawings.
FIG. 1 is a schematic diagram of the overall structure of the present invention.
FIG. 2 is a schematic view of the back of the hoop support apparatus of the present invention.
FIG. 3 is a top view of the hoop support device of the present invention.
FIG. 4 is a flowchart of the coarse-adjustment and fine-adjustment procedures of the present invention.
In the drawings: 1 - lighting device, 2 - first guide rail, 3 - first fixing plate, 4 - second fixing plate, 5 - third fixing plate, 6 - large gear, 7 - small gear, 8 - first motor, 9 - second motor, 10 - third motor, 11 - second guide rail, 12 - ultrasonic module, 13 - monocular camera module, 14 - hoop support device body, 15 - cross auxiliary positioning pattern, 16 - rubber tree cut mark.
Detailed Description
The technical solution in the embodiments of the present invention will be clearly and completely described below with reference to the accompanying drawings in the embodiments of the present invention. It is to be understood that the embodiment described below is merely one embodiment of the invention, and not all embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
In order to make the aforementioned objects, features and advantages of the present invention comprehensible, embodiments accompanied with figures are described in detail below.
Fig. 1 is a schematic diagram of the overall structure of the robot, which includes the ultrasonic module 12, the monocular camera 13, the lighting device 1, the hoop support device 14, a mechanical arm capable of moving in three dimensions, and a controller module mounted on the back of the robot; the controller module uses a vehicle-mountable BPC-4501L industrial personal computer.
Fig. 2 is a schematic structural diagram of the hoop support device 14. The ultrasonic module 12 consists of three HC-SR04 units; the monocular camera 13 is an HBV-1716 with a USB interface. The ultrasonic ranging modules and the monocular camera 13 are mounted on the hoop support device 14, which is connected to the three-dimensionally movable mechanical arm and can be moved under the control of the controller. The lighting device 1 is fixed at the top end of the first guide rail 2 and supplies fill light for the monocular camera during night work.
In the above cutting-knife positioning algorithm, the controller module runs Ubuntu 16.04, the monocular camera 13 processes images with OpenCV, and both the ultrasonic data and the mechanical-arm control commands are published and subscribed through ROS topics.
In the above cutting-knife positioning algorithm, the ultrasonic ranging module 12 is a common off-the-shelf HC-SR04 module. The installation is shown in Fig. 3: one unit is mounted at the middle of the top of the hoop device 14, and the other two are located at the left and right ends of the top, symmetrically placed, each forming a 45-degree angle with the line connecting them.
In the above cutting-knife positioning algorithm, the monocular camera 13 module is a 1920 x 1080 USB camera. In Fig. 2 it is mounted 10 cm directly above the cutter's starting position; in Fig. 3 it is fixed at the leftmost position on top of the hoop 14 and tilted slightly to the right.
In the above cutting-knife positioning algorithm, the hoop device 14 as a whole is a semi-cylindrical structure whose diameter is greater than or equal to the actual trunk diameter, so that it can close tightly against the trunk as the mechanical arm moves.
In the above cutting-knife positioning algorithm, the mechanical-arm motion module is driven by the controller module and can move the hoop 14 forward/backward, left/right and up/down, thereby carrying the monocular camera 13 and the ultrasonic module 12 for data acquisition and cutting-knife positioning. The arm operates as follows. The first motor 8 is mounted inside the third fixing plate 5; through the large-gear/small-gear transmission it sweeps the hoop device left and right. The top of the third fixing plate 5 is fixed to the second fixing plate 4 with screws, and its other side is fixed to the first guide rail 2 through a slot; driven by the second motor 9, the first guide rail 2 moves the third fixing plate 5 up and down. The second motor 9 and the third motor 10 are fixed horizontally on the bottom moving platform and respectively drive the vertically moving first guide rail 2 and the horizontally moving second guide rail 11. The first guide rail 2 is fixed vertically on the second guide rail 11 through a slot, the second guide rail 11 is fixed to the bottom moving platform with screws, and the third motor 10 moves the first guide rail horizontally back and forth along the second guide rail. The first motor 8, second motor 9 and third motor 10 are all brushless DC motors.
In the above cutting-knife positioning algorithm, the lighting module 1 is a common 24 W lamp mounted at the top of the first guide rail 2. In low-light conditions it provides fill light for the monocular camera 13 to assist its recognition of the cross positioning mark 15.
In the above cutting-knife positioning algorithm, the cross auxiliary positioning mark 15 is fixed on the rubber tree as shown in Fig. 2, 10 cm above the cut mark 16.
The specific control steps of the cutting-knife positioning algorithm based on the combination of the ultrasonic sensors and the monocular camera 13 are as follows, as shown in Fig. 4. Coarse-adjustment procedure:
Step 1. Check whether the battery voltage is too low; if so, report that the battery charge is insufficient for operation, otherwise initialize the sensors and parameters.
Step 2. Run the ROS main program and start each node manager.
Step 3. Listen to the ultrasonic topics and read the data currently returned by the three ultrasonic sensors. If no data are available, keep listening and repeat step 3; otherwise go to step 4. The HC-SR04 module emits ultrasonic pulses at a frequency of 40,000 Hz, which propagate through the air; when a pulse meets an object or obstacle it bounces back to the module and is captured. The distance can then be obtained from the elapsed time and the propagation speed of the ultrasonic wave: formulas (1) and (2) below give the total distance traveled by the pulse during the interval, and the final result (3) is half of that, since the pulse travels to the target and back:
v = 340 m/s (1)

s = v * t (2)

Distance = s / 2 (3)

where t is the measured interval between emission and echo capture, v is the propagation speed of sound in air, s is the total round-trip path, and Distance is the one-way distance to the target.
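The distance computation of formulas (1) to (3) can be sketched as follows (a minimal illustration assuming a nominal 340 m/s speed of sound; the HC-SR04 timing capture itself is hardware-specific and omitted):

```python
SPEED_OF_SOUND_M_S = 340.0  # nominal speed of sound in air, formula (1)

def hcsr04_distance_cm(echo_time_s):
    """One-way distance to the target, in centimeters.
    The echo time covers the round trip, so formula (2) gives the total
    path traveled and formula (3) halves it."""
    round_trip_m = SPEED_OF_SOUND_M_S * echo_time_s  # formula (2)
    return round_trip_m / 2.0 * 100.0                # formula (3), m -> cm
```

For example, an echo delay of 1 ms corresponds to a 34 cm round trip, i.e. a target 17 cm away.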
Step 4. Execute the ultrasonic callback function, then go to step 5.
Step 5. Compute the position deviation of the target rubber tree relative to the three ultrasonic sensors 12 with the difference-over-sum algorithm. The valid measuring range of the ultrasonic sensors in the coarse stage is [3, 400] cm; since in practice the robot's positioning deviation is never large and a tighter limit makes the positioning more accurate, the sensor values are clamped to the range [3, 200] cm. For convenience, denote the leftmost ultrasonic sensor in Fig. 3 as L, the middle one as M and the rightmost one as R, and the data they collect as L_data, M_data and R_data respectively. The algorithm performs difference-over-sum fusion of the ultrasonic data measured at several positions on the same horizontal plane, computes the relative position deviation between the hoop support device and the target rubber tree, makes a preliminary judgment of whether a target tree lies directly ahead of the hoop, and then preliminarily adjusts the mechanical arm according to the computed deviation to complete the coarse-positioning stage.
From the three sensor return values L_data, M_data and R_data, the position discrimination value Value is computed on the difference-over-sum principle:

Numerator = L_data - R_data (4)

Denominator = L_data + M_data + R_data (5)

Value = A * Numerator / (B * Denominator) (6).
In the above formula (6), Numerator is the numerator of the ratio. When the target is to the left of the hoop device, the distance measured by the left ultrasonic sensor is short and its value small, while the right sensor's distance is long and its value large; the middle sensor's value lies between the two, so the result computed by the formula is negative. Likewise, when the target rubber tree is to the right of the hoop device, the result is positive, and when the target is exactly in the middle of the hoop device, the result is 0. Once the hoop has been moved so that Value falls within the interval [-0.1, 0.1], coarse positioning can be judged to be initially complete.
In the above formula (6), Denominator is the denominator of the ratio and is always positive; it is computed over all three sensors.
In the above formula (6), Value is the ratio of the two, and A and B are proportionality coefficients, with A = 1.93 and B = 0.13.
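The sign behavior described for the difference-over-sum value can be checked numerically. A sketch, with the caveat that the original formula images are not reproduced in this text, so the exact placement of the coefficients A and B within the ratio is an assumption:

```python
def position_value(l_data, m_data, r_data, a=1.93, b=0.13):
    """Difference-over-sum discrimination value from the three ranges."""
    numerator = l_data - r_data              # (4): sign encodes left/right
    denominator = l_data + m_data + r_data   # (5): always positive
    return a * numerator / (b * denominator) # (6): scaled ratio

# Tree left of the hoop: left range short, right range long -> negative.
left_case = position_value(30.0, 40.0, 60.0)
# Tree centered: left and right ranges equal -> zero.
center_case = position_value(45.0, 40.0, 45.0)
# Tree right of the hoop: mirror image of the left case -> positive.
right_case = position_value(60.0, 40.0, 30.0)
```

The middle sensor contributes only to the denominator, so it normalizes the value against overall distance without affecting its sign.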
Referring to the sensor layout in Fig. 3, the experimental data in Table 1 below were obtained with a rubber-tree diameter of 20 cm, a hoop diameter of 20 cm, the M ultrasonic sensor taken as the midpoint, a minimum distance of 23 cm between M and the trunk, and the hoop rotated left and right.
Table 1. Calculation results for the left-right hoop-rotation experiment:
[Table 1 is reproduced only as an image in the original filing.]
and 6, judging whether the target rubber tree is positioned right in front of the hoop device, namely the ultrasonic position calculation value is in the range of [ -0.1,0.1 ]. If yes, go to step 8, otherwise, go to step 7.
And 7, adjusting the position of the mechanical arm through PID according to the position calculation deviation, and repeating the step 3.
And 8, setting the position of the coarse adjustment completion flag.
As shown in Fig. 2, the monocular camera 13 fine-tunes the cutting-knife position by recognizing the cross auxiliary positioning mark 15 on the target rubber tree. A red cross-shaped auxiliary positioning pattern 15 is attached to the target tree 10 cm above the first tapping start position 16, and the monocular camera 13 is likewise mounted 10 cm above the cutter's initial position. The area and center-point pixel position that the cross pattern 15 should occupy in the camera image are calibrated in advance. The camera's forward/backward position is adjusted until the pattern's apparent size in the field of view matches the calibrated target area, and the camera is then moved up/down and left/right until the pattern's center coincides with the preset pattern's center, completing the fine adjustment of the cutting-knife position.
Referring to Fig. 4, the fine-adjustment procedure is as follows.
Step 1, detecting whether the voltage of the battery is too low, if the voltage is too low, prompting that the battery is low and cannot work, otherwise initializing each sensor and parameter, setting a color threshold value of the cross auxiliary identification mark, and turning on the lighting device.
And 2, executing the ROS main program, starting each node manager, and starting to issue topics and monitor topic data.
And 3, acquiring a monocular camera image.
And 4, judging whether the coarse positioning is finished at the moment, if so, executing the step 5, and otherwise, executing the step 3.
And 5, executing a monocular camera callback function, and marking the shape with the same size as the cross auxiliary pattern at the center position of the original picture acquired by the sensor to obtain a first preprocessed image.
And 6, converting the original image into an HSV model, carrying out gray level processing according to the color threshold parameter, and shielding objects with other colors in the image to obtain an image II.
And 7, performing convolution filtering processing on the image II, converting the image II into a binary image and storing the binary image as an image III.
And 8, identifying the outline of the white area in the image III, marking the outline on the original image, and simultaneously recording the position of the central point of the identified mark.
And 9, comparing the preprocessed image I obtained in the step 4 with the cross auxiliary identification mark area of the original image processed in the step 8, calculating the coordinate deviation of the central points of the two marks, controlling the mechanical arm to further move forwards and backwards and leftwards and rightwards by using a PID (proportion integration differentiation), adjusting the position to ensure that the centers of the cross marks of the two images are overlapped as much as possible, and ensuring that the total area of the cross marks recognized in the step 8 is equal to the preset cross auxiliary identification mark as much as possible.
Step 10: judge whether the current monocular camera image mark is aligned with the mark on the target trunk; if so, execute Step 11; otherwise, return to Step 3.
Step 11: set the fine-adjustment completion flag bit.
Step 12: end the program.
The principles and embodiments of the present invention are explained herein with specific examples; the above description of the embodiments is intended only to help in understanding the method and its core idea. Meanwhile, a person skilled in the art may, following the idea of the present invention, modify the specific embodiments and the scope of application. In summary, the content of this specification should not be construed as limiting the present invention.

Claims (5)

1. A cutting knife positioning algorithm based on the combination of an ultrasonic sensor and a monocular camera, comprising an ultrasonic module, a monocular camera module, and an auxiliary identification mark on a target rubber tree; the algorithm performs difference-ratio and fusion processing on ultrasonic module data taken at a plurality of different positions on the same horizontal plane, calculates from this the relative position deviation between the hoop supporting device and the target rubber tree, and then performs a preliminary adjustment of the mechanical arm according to the calculated relative position deviation to complete the coarse positioning process; the matched monocular camera then performs accurate position matching of the cross auxiliary positioning mark on the target trunk, and the mechanical arm is finely adjusted according to the matching information to achieve the final accurate positioning.
2. The cutting knife positioning algorithm based on the combination of the ultrasonic sensor and the monocular camera as set forth in claim 1, wherein the program control flow comprises a coarse adjustment and a fine adjustment, and the fine adjustment is performed immediately after the coarse adjustment.
3. The cutting knife positioning algorithm based on the combination of the ultrasonic sensor and the monocular camera as claimed in claim 1, wherein the ultrasonic module is installed as shown in figs. 1, 2 and 3: ultrasonic module M is installed at the middle of the top of the hoop device, and ultrasonic modules L and R are located at the left and right ends of the hoop top, each forming a 45-degree included angle with the line connecting them, and distributed symmetrically about the hoop.
4. The cutting knife positioning algorithm based on the combination of the ultrasonic sensor and the monocular camera as claimed in claim 1, wherein the monocular camera module is installed as shown in figs. 1, 2 and 3: the monocular camera is mounted adjacent to the left ultrasonic sensor of the hoop device, to the left of ultrasonic module L, with its direction deflected about 10 degrees to the right.
5. The cutting knife positioning algorithm as claimed in claim 3, wherein the ultrasonic acquisition data are fused to obtain a relative position deviation value, which serves as the distance information to be corrected by the mechanical arm, and the mechanical arm is controlled to adjust according to this deviation.
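The fusion in claim 5 admits a simple geometric sketch. The formula below is one plausible interpretation of the symmetric 45-degree mounting of claim 3, not a formula given by the patent, and `target_range` is a hypothetical stand-off set-point:

```python
import math

def fuse_ultrasonic(d_m, d_l, d_r, target_range):
    """Fuse the three ultrasonic readings into a position deviation.
    d_m: middle sensor facing the tree; d_l, d_r: side sensors at 45 deg.
    Returns (forward_error, lateral_error); both formulas are assumptions."""
    # Advance/retreat correction: how far the hoop is from the set-point.
    forward_error = d_m - target_range
    # With the symmetric mounting, equal side readings mean the hoop is
    # centered; the difference scaled by cos(45 deg) approximates the
    # lateral offset of the tree from the hoop axis.
    lateral_error = (d_l - d_r) * math.cos(math.pi / 4) / 2.0
    return forward_error, lateral_error
```

A zero lateral error together with a zero forward error would mark the end of coarse positioning, handing control over to the monocular-camera fine adjustment.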
CN202011082122.1A 2020-10-12 2020-10-12 Cutting knife positioning algorithm based on combination of ultrasonic sensor and monocular camera Pending CN112215139A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011082122.1A CN112215139A (en) 2020-10-12 2020-10-12 Cutting knife positioning algorithm based on combination of ultrasonic sensor and monocular camera

Publications (1)

Publication Number Publication Date
CN112215139A true CN112215139A (en) 2021-01-12

Family

ID=74053229

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011082122.1A Pending CN112215139A (en) 2020-10-12 2020-10-12 Cutting knife positioning algorithm based on combination of ultrasonic sensor and monocular camera

Country Status (1)

Country Link
CN (1) CN112215139A (en)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105961139A (en) * 2016-07-13 2016-09-28 武汉市享昱科技有限公司 Control system of automatic rubber tapping robot
CN106034980A (en) * 2016-07-13 2016-10-26 武汉市享昱科技有限公司 Automatic rubber cutting robot
US20180035606A1 (en) * 2016-08-05 2018-02-08 Romello Burdoucci Smart Interactive and Autonomous Robotic Property Maintenance Apparatus, System, and Method
CN110122256A (en) * 2019-05-21 2019-08-16 中国农业大学 A kind of rubber tapping robot system and method for tapping rubber
CN110696008A (en) * 2019-10-14 2020-01-17 北京信息科技大学 Rubber tapping robot
CN111448967A (en) * 2020-04-26 2020-07-28 北京金自能源科技发展有限公司 Movable rubber tapping robot
CN111469149A (en) * 2020-04-13 2020-07-31 中国农业大学 Robot end effector based on hand-eye servo

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
CHUNLONG ZHANG et al.: "A Rubber-Tapping Robot Forest Navigation and Information Collection System Based on 2D LiDAR and a Gyroscope", MDPI, vol. 19, no. 9, pages 1-21 *
汪雄伟 (Wang Xiongwei): "Design of a fixed fully-automatic intelligently controlled rubber tapping machine", 《农业工程》 (Agricultural Engineering), vol. 10, no. 7, pages 79-84 *

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114800577A (en) * 2022-06-08 2022-07-29 北方民族大学 Valve body casting head cutting positioning device and positioning method
CN114800577B (en) * 2022-06-08 2023-08-15 北方民族大学 Valve body casting riser cutting and positioning device and positioning method
CN118238156A (en) * 2024-03-14 2024-06-25 深圳市远望工业自动化设备有限公司 Tool motion control method and related device of five-axis machining robot

Similar Documents

Publication Publication Date Title
CN110370286B (en) Method for identifying rigid body space position of dead axle motion based on industrial robot and monocular camera
CN108747132B (en) Autonomous mobile welding robot vision control system
US8706300B2 (en) Method of controlling a robotic tool
CN112215139A (en) Cutting knife positioning algorithm based on combination of ultrasonic sensor and monocular camera
CN107860366B (en) Mobile greenhouse crop information measurement and diagnosis system
CN105728972A (en) Concave-convex angle-variable welding joint self-adaptive tracking control device and method
CN108406123A (en) 3 d part calibration system and method in a kind of laser processing
CN109702290B (en) Steel plate groove cutting method based on visual identification
CN110281231B (en) Three-dimensional vision grabbing method for mobile robot for unmanned FDM additive manufacturing
CN112363503B (en) Orchard vehicle automatic navigation control system based on laser radar
CN104408408A (en) Extraction method and extraction device for robot spraying track based on curve three-dimensional reconstruction
CN110293559B (en) Installation method for automatically identifying, positioning and aligning
CN110651686B (en) Tapping method and system based on tapping mechanical arm
JP2903964B2 (en) Three-dimensional position and posture recognition method based on vision and three-dimensional position and posture recognition device based on vision
CN115014338A (en) Mobile robot positioning system and method based on two-dimensional code vision and laser SLAM
CN110232710B (en) Article positioning method, system and equipment based on three-dimensional camera
CN112470735B (en) Regular-shape nursery stock automatic trimming device and method based on three-dimensional positioning
CN104019761A (en) Three-dimensional configuration obtaining device and method of corn plant
CN117047237B (en) Intelligent flexible welding system and method for special-shaped parts
CN110488309A (en) The target Calibration Method of foreign matter device is removed based on laser
CN110378901B (en) Spherical flower and tree center measurement and pruning control method based on depth camera
CN113063349B (en) Rubber tree cutting point detection system and detection method
CN204288242U (en) Based on the Control During Paint Spraying by Robot trajectory extraction device that curved three-dimensional is rebuild
CN112720449A (en) Robot positioning device and control system thereof
CN207549146U (en) With reference to the four side shaping processing unit (plant) of flat work of vision positioning

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
WD01 Invention patent application deemed withdrawn after publication

Application publication date: 20210112