CN219161992U - Intelligent plant monitoring robot - Google Patents

Intelligent plant monitoring robot

Info

Publication number
CN219161992U
Authority
CN
China
Prior art keywords
module
chassis
computing device
plant
information acquisition
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202223443125.0U
Other languages
Chinese (zh)
Inventor
刘禹
葛亚鹏
谷逸明
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hebei University of Technology
Original Assignee
Hebei University of Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hebei University of Technology filed Critical Hebei University of Technology
Priority to CN202223443125.0U priority Critical patent/CN219161992U/en
Application granted granted Critical
Publication of CN219161992U publication Critical patent/CN219161992U/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • Y — GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 — TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02P — CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P 90/00 — Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P 90/02 — Total factory control, e.g. smart factories, flexible manufacturing systems [FMS] or integrated manufacturing systems [IMS]

Landscapes

  • Manipulator (AREA)

Abstract

The application discloses an intelligent plant monitoring robot comprising a mobile module, an information acquisition module, and a central control module. The mobile module comprises a chassis with at least one driving wheel on each side. The information acquisition module is arranged on the chassis and comprises a mechanical arm and a monitoring end mounted on the arm; the arm drives the monitoring end toward the plant to be monitored. The central control module, also arranged on the chassis and electrically connected with the information acquisition module, comprises an information processing unit and a drive control unit. The robot replaces manual labor in monitoring plants through their growth period, avoiding the drawbacks of traditional agriculture, where crop growth and pest/disease monitoring rely on farmers' visual inspection and a degree of expert experience and therefore suffer from strong subjectivity and low identification accuracy. It thus frees up manpower, improves accuracy, and boosts productivity.

Description

Intelligent plant monitoring robot
Technical Field
The present disclosure relates generally to the field of plant monitoring devices, and in particular to an intelligent plant monitoring robot.
Background
With the spread of agricultural modernization and the rapid development of plant factories, plant-factory intelligence has become one of the main directions of smart agriculture. A plant factory is a sustainable production system that adopts three-dimensional cultivation and uses a high-precision environmental control system to provide optimal growth parameters for crops. It can free production entirely from the constraints of natural conditions and operate in an assembly-line fashion, achieving plant yields tens or even hundreds of times higher than traditional agriculture.
One characteristic of a plant factory is that plants are numerous and densely distributed, and both growth-state assessment and pest/disease diagnosis depend on collecting visual information. An intelligent plant monitoring system that can flexibly collect visual information from plants and automatically analyze it therefore has broad market prospects. The original approach of manual inspection and on-site diagnosis is inefficient and cannot scale to large plant factories.
The large-scale environmental-monitoring IoT systems adopted in recent years rely mainly on fixed cameras and sensors, which suffer from high hardware cost, complex wiring, and an inability to be flexibly deployed or expanded within the factory. Studying the visual appearance of plants at different growth stages in depth, analyzing pest and disease phenomena, and developing a visual monitoring system suited to plant factories, so as to give technicians efficient and accurate results for judging plant growth, is therefore an important direction for future development.
Intelligent plant monitoring robots build on neural-network analysis and a flexible mobile robot chassis. Chinese patent CN104092975A (published 2014-10-08) discloses a plant monitoring device that follows a traditional visual-monitoring scheme; it can only analyze plant growth curves and its analysis lacks intelligence. Chinese patent CN208580322U (published 2019-03-05) discloses a portable big-data plant-factory monitoring system, but its hardware description does not explain how the system is integrated so as to actually achieve portability. Chinese patent CN103676858A (published 2014-03-26) discloses a plant monitoring device and system that use flowerpots as the mounting hardware, so its one-to-one monitoring is inefficient and inflexible for large-scale production. To this end, we provide an intelligent plant monitoring robot to solve the above problems.
Disclosure of Invention
In view of the above-described drawbacks or shortcomings of the prior art, it is desirable to provide an intelligent plant monitoring robot that can monitor plant growth, remotely acquire monitoring data, and move flexibly.
In a first aspect, the present application provides an intelligent plant monitoring robot comprising:
a mobile module comprising a chassis, with at least one driving wheel arranged on each side of the chassis;
an information acquisition module arranged on the chassis and comprising a mechanical arm and a monitoring end mounted on the mechanical arm, the mechanical arm being used to drive the monitoring end toward the plant to be monitored;
a central control module arranged on the chassis and electrically connected with the information acquisition module, the central control module comprising an information processing unit and a drive control unit;
the information processing unit receives plant information acquired by the monitoring end; the drive control unit has a plurality of control/drive ends and is used to control the actions of the driving wheels and the mechanical arm.
According to the technical solution provided by embodiments of the application, the central control module comprises an edge computing device and an embedded computing device, which together form the information processing unit and the drive control unit; both communicate with the data center through a DTU unit.
According to the technical solution provided by embodiments of the application, the mobile module further comprises:
bearing wheels rotatably mounted about their own axes on both sides of the chassis;
a crawler track fitted around the bearing wheels and the driving wheel.
According to the technical solution provided by embodiments of the application, the information acquisition module further comprises:
a camera module, which is a binocular camera and forms the monitoring end; the binocular camera comprises a left camera and a right camera, both 1.3-megapixel USB cameras.
According to the technical solution provided by embodiments of the application, the robot further comprises a sensor module, mounted on the chassis via a rotary steering engine and electrically connected with the central control module; the sensor module is used to identify road conditions.
According to the technical solution provided by embodiments of the application, the robot further comprises a display module, arranged on the chassis and connected to the edge computing device through an HDMI cable; the display module displays the operating-system interface and system control parameters.
According to the technical solution provided by embodiments of the application, the robot further comprises a radar module, arranged on the chassis and electrically connected with the edge computing device.
According to the technical solution provided by embodiments of the application, the robot further comprises a drive control module, electrically connected with the embedded computing device and used to control the operation of the motor.
To sum up, this technical solution discloses an intelligent plant monitoring robot comprising a mobile module, an information acquisition module, and a central control module. The mobile module comprises a chassis with at least one driving wheel on each side, enabling the robot to move flexibly. The information acquisition module, arranged on the chassis, comprises a mechanical arm and a monitoring end mounted on it; the arm drives the monitoring end toward the plant to be monitored to collect information such as leaf color and growth state. The central control module, arranged on the chassis and electrically connected with the information acquisition module, serves as the main controller and comprises an information processing unit and a drive control unit. The information processing unit receives plant information acquired by the monitoring end and sends it to the data center, which further evaluates it to realize intelligent analysis of the plants' condition. The drive control unit has a plurality of control/drive ends for controlling the driving wheels and the mechanical arm.
Using the intelligent plant monitoring robot in place of manual monitoring over the plant growth period, the mobile module, information acquisition module, and central control module apply deep-learning-based computer vision and image recognition to plant growth monitoring; plant recognition accuracy can reach 94%. This avoids the situation in traditional agriculture where crop growth and pest/disease monitoring rely mostly on farmers' visual inspection and a degree of expert experience, with drawbacks such as strong subjectivity and low recognition accuracy. It frees up manpower, improves accuracy, and boosts productivity. In addition, by integrating all modules on the robot chassis, the design combines the efficiency of fixed equipment with the flexibility of manual work.
Drawings
Other features, objects and advantages of the present application will become more apparent upon reading of the detailed description of non-limiting embodiments, made with reference to the following drawings, in which:
fig. 1 is a schematic mechanical structure of an intelligent plant monitoring robot.
Fig. 2 is a schematic structural diagram of an intelligent plant monitoring robot.
Fig. 3 is a side view of the mechanical structure of an intelligent plant monitoring robot.
Fig. 4 is a top view of the mechanical structure of an intelligent plant monitoring robot.
Fig. 5 is a side view of a robotic arm in an intelligent plant monitoring robot.
Fig. 6 is a top view of a robotic arm in an intelligent plant monitoring robot.
Reference numerals in the drawings: 1. power module; 2. display module; 3. chassis; 4. motor; 5. camera module; 6. radar module; 7. mechanical arm; 8. central control module; 8-1. edge computing device; 8-2. embedded computing device; 9. drive control module; 10. DTU unit; 11. sensor module.
Detailed Description
The present application is described in further detail below with reference to the drawings and examples. It is to be understood that the specific embodiments described herein are merely illustrative of the utility model and are not limiting of the utility model. It should be noted that, for convenience of description, only the portions related to the utility model are shown in the drawings.
It should be noted that, in the case of no conflict, the embodiments and features in the embodiments may be combined with each other. The present application will be described in detail below with reference to the accompanying drawings in conjunction with embodiments.
Example 1
Please refer to fig. 1, a schematic mechanical diagram of the first embodiment of the intelligent plant monitoring robot, and fig. 2, a schematic diagram of its structural principle. The robot includes:
a mobile module comprising a chassis 3, with at least one driving wheel arranged on each side of the chassis 3;
an information acquisition module arranged on the chassis 3 and comprising a mechanical arm 7 and a monitoring end mounted on the mechanical arm 7; the mechanical arm 7 drives the monitoring end toward the plant to be monitored;
a central control module 8 arranged on the chassis 3 and electrically connected with the information acquisition module; the central control module 8 comprises an information processing unit and a drive control unit;
the information processing unit receives the plant information acquired by the monitoring end; the drive control unit has a plurality of control/drive ends for controlling the driving wheels and the mechanical arm 7.
In this embodiment, the chassis of the mobile module is the main load-bearing structure; each module is integrated on the chassis, allowing the robot to operate with a high degree of automation. Driving wheels on both sides of the chassis are connected to a motor 4 beneath the chassis 3; driven by the motor 4, the driving wheels work together with the bearing wheels to move the robot flexibly.
The information acquisition module is arranged on chassis 3 and includes the mechanical arm 7 and the monitoring end mounted on it; the mechanical arm 7 drives the monitoring end toward the plant to be monitored. Optionally, the mechanical arm 7 can be a six-axis arm capable of a variety of complex operations; in practice it is built from high-precision servo actuators (steering engines) to guarantee operating precision.
The servos are powered at 5 V and driven by PWM signals of a fixed frequency; their rotation is controlled by varying the duty cycle. Specifically, the robot uses both 360° (continuous-rotation) and 180° servos, covering the range of operations it requires.
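The duty-cycle control described above can be sketched as follows. This is a minimal illustration assuming the common hobby-servo timing of a 50 Hz signal and a 0.5–2.5 ms pulse range; the patent states only that rotation is set via the duty ratio, so these timing values are assumptions.

```python
def servo_duty_cycle(angle_deg, freq_hz=50.0,
                     min_pulse_ms=0.5, max_pulse_ms=2.5,
                     max_angle=180.0):
    """Map a target angle to a PWM duty cycle for a positional servo.

    Assumes the common 0.5-2.5 ms pulse range at 50 Hz (illustrative
    values; the patent does not specify the servo timing).
    """
    if not 0.0 <= angle_deg <= max_angle:
        raise ValueError("angle out of range")
    period_ms = 1000.0 / freq_hz  # 20 ms period at 50 Hz
    pulse_ms = min_pulse_ms + (max_pulse_ms - min_pulse_ms) * angle_deg / max_angle
    return pulse_ms / period_ms   # duty as a fraction of the period

# 90 degrees -> 1.5 ms pulse -> 7.5% duty at 50 Hz
print(servo_duty_cycle(90.0))  # 0.075
```

A continuous-rotation (360°) servo would instead map the duty cycle to speed and direction, but the pulse-to-duty arithmetic is the same.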
The central control module 8 is arranged on the chassis 3 and electrically connected with the information acquisition module. It comprises an information processing unit and a drive control unit: the information processing unit receives plant information acquired by the monitoring end and transmits it to the data center for further analysis, while the drive control unit, with its plurality of control/drive ends, controls the driving wheels and the mechanical arm 7, so that the whole robot can carry out highly automated movement and monitoring tasks according to human control instructions.
Specifically, the central control module 8 includes an edge computing device 8-1 and an embedded computing device 8-2, which together form the information processing unit and the drive control unit. Both communicate with the data center through a DTU unit 10, where DTU stands for Data Transfer Unit: a wireless terminal device dedicated to converting serial data to IP data (and vice versa) for transmission over a wireless communication network.
Further, the edge computing device 8-1 may be a Jetson Nano, providing up to 22 TOPS of computing power, fully meeting the neural network's demands, and supplied with 5 V from the 24 V battery through a buck-boost module. The embedded computing device 8-2 may be an ESP32; its dual-core 240 MHz processor is fully capable of real-time task processing, and its on-chip Wi-Fi and Bluetooth 4.0 modules facilitate communication with the edge computing device 8-1 and the host. The DTU unit 10 uses a "USR Cloud" data transmission terminal to transmit and receive data.
Furthermore, the edge computing device 8-1 runs Ubuntu 16.04 with the ROS system on top. It collects data from the radar module 6 and performs SLAM mapping with the Cartographer algorithm; the ros-navigation package set gives the robot autonomous navigation and obstacle avoidance, the AMCL particle-filter algorithm provides global localization, and a global path planner based on the A* algorithm handles path planning. Control information computed by these algorithms on the edge computing device 8-1 is transmitted to the embedded computing device 8-2, which completes the drive control: according to the control information it outputs PWM waves of varying duty cycle in real time to drive the DC motors 4 and carry out the robot's locomotion. The robot can thus sweep back and forth over the plants in a given area, working both more flexibly and more precisely, and improving efficiency. Meanwhile, plant images collected by the monitoring end are transmitted over USB to the edge computing device 8-1, which runs neural-network inference to judge the plant species and the growth and health states of the leaves; plants inferred to be possibly diseased, together with the growth-state results, are sent back to the data center for further judgment.
The ROS system is an open-source meta operating system for robots. It provides the services an operating system should offer, including hardware abstraction, low-level device control, implementations of common functionality, inter-process message passing, and package management, plus the tools and libraries needed to obtain, build, write, and run code across computers. The robot uses ROS for task distribution and message streaming.
Specifically, the mobile module further includes:
bearing wheels, rotatably mounted about their own axes on both sides of the chassis 3;
a crawler track fitted around the bearing wheels and the driving wheel.
The mobile module is chiefly responsible for the robot's locomotion. Tracks are fitted on both sides of the robot; in practice, each track runs over five smaller bearing wheels and one larger driving wheel, the driving wheel being connected to a motor 4 beneath the chassis 3 to form the travelling mechanism, which the embedded computing device 8-2 drives to execute the commanded motion. Tracks were chosen for the complexity of the ground the robot runs on, and testing showed they can cross most obstacles. Track transmission is driven by brushed DC motors through the motor driver; the drive control module 9 supplies the current the motors need, and forward/reverse rotation of each motor 4 is controlled via the 0/1 state of the motor-driver chip's pins.
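The tracked chassis steers differentially, so the two track speeds can be derived from a desired forward speed and turn rate with standard differential-drive kinematics. A minimal sketch, in which the track separation (`track_width`) is a hypothetical value not given in the patent:

```python
def track_speeds(v, omega, track_width=0.3):
    """Differential-drive kinematics for a tracked chassis.

    v: forward speed (m/s); omega: yaw rate (rad/s, CCW positive).
    track_width (m) is a hypothetical value; the patent gives none.
    Returns (left, right) track speeds in m/s.
    """
    left = v - omega * track_width / 2.0
    right = v + omega * track_width / 2.0
    return left, right

# Driving straight: both tracks run at the same speed.
print(track_speeds(0.5, 0.0))
# Turning in place: the tracks run at equal and opposite speeds,
# which is what gives the small turning radius described later.
print(track_speeds(0.0, 2.0))
```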
As shown in fig. 5 and 6, specifically, the information acquisition module further includes:
the camera module 5 is arranged on the mechanical arm 7, and the camera module 5 is a binocular camera and forms the monitoring end; the binocular camera consists of a left camera and a right camera, and the left camera and the right camera are USB cameras with 130W pixels; the camera module 5 is mainly used for collecting plant images, transmitting the plant images to the edge computing device 8-1 through a USB line, and carrying out reasoning of a follow-up neural network by the edge computing device 8-1.
It should be noted that the binocular depth camera can return distance information for a target, helping the robot avoid obstacles, and can acquire 1080p high-resolution images that provide more plant detail for the network's inference. Inference runs on the robot's edge computing device 8-1 (Jetson Nano), whose performance fully meets the network's needs; inference is GPU-accelerated, and testing shows the recognition speed fully meets the demands of the robot's daily work.
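The distance information a binocular camera returns follows the standard stereo relation depth = focal length × baseline / disparity. A sketch with illustrative numbers (the patent specifies only the cameras' resolution, so the focal length and baseline below are assumptions):

```python
def stereo_depth(disparity_px, focal_px, baseline_m):
    """Depth from binocular disparity: Z = f * B / d.

    focal_px: focal length in pixels; baseline_m: distance between
    the left and right cameras. Values used below are illustrative;
    the patent gives neither.
    """
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px

# e.g. f = 700 px, 6 cm baseline, 35 px disparity -> about 1.2 m
print(stereo_depth(35.0, 700.0, 0.06))
```

Nearby objects produce larger disparities, which is why stereo depth is most precise at close range, where the arm positions the cameras.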
Specifically, the robot further comprises a sensor module 11, mounted on the chassis 3 via a rotary steering engine and electrically connected with the central control module 8; the sensor module 11 is used to identify road conditions.
The sensor module 11 mainly serves the robot's communication: the various sensors (temperature, humidity, illumination, etc., selectable as needed) connect to the central control module 8 over USB and relay the collected plant information back to it. In practice, sensors fall into two categories: those whose data must be processed in real time, such as a nine-axis gyroscope/accelerometer, and those whose data need not be, such as sensors for weather and soil conditions, or the camera. Real-time data goes to the embedded computing device 8-2 for immediate processing, after which the computed results are passed to the edge computing device 8-1 for analysis and subsequent control. Non-real-time data is transmitted over USB to the edge computing device 8-1 for analysis, then sent over a serial interface to the DTU unit 10, which forwards it to the data center over the 4G network.
As shown in fig. 3, the robot further includes a display module 2, arranged on the chassis 3 and connected to the edge computing device 8-1 through a High-Definition Multimedia Interface (HDMI) cable; the display module 2 displays the operating-system interface and system control parameters.
Further, the display module 2 sits at the highest point, above the sensor module 11; a seven-inch screen forms a convenient human-machine interface. The edge computing device 8-1 connects to the display module 2 via an HDMI cable, displaying the operating-system interface and the various system control parameters. The drive control module is managed by the lower-compute embedded computing device 8-2, and sensor data that needs real-time processing, such as IMU data, must be processed by the embedded computing device 8-2 before being sent to the edge computing device 8-1. In this way the system can accurately track plant growth state and measure the parameters plants require at different stages, raising the degree of automation of traditional agriculture.
The display module 2 chiefly serves the robot's on-screen display. In practice, a series of user interfaces (UIs) were designed for the system to make it easy to operate. The UI shows in real time the sensor information the robot collects, letting the user check the plants' state; it also shows the robot's battery level, working status, and progress, including which areas have been inspected, the target area currently being inspected, and the recorded plant-inspection results. Besides displaying information, the UI provides controls: the user can order the robot back to the charging area and out of its working state, and can designate a target inspection area (the user supplies a destination and the robot plans a path automatically, or the robot scans a QR code to obtain its next task). The robot's "brain" runs the Ubuntu system.
As shown in fig. 4, the radar module 6 is arranged on the chassis 3 and electrically connected with the edge computing device 8-1.
In practice, the radar module 6 chiefly serves the robot's path planning: it collects environmental information and passes it to the edge computing device 8-1, which runs its control on the Ubuntu 16.04 operating system.
Specifically, the drive control module 9 is electrically connected with the embedded computing device 8-2 and is used to control the operation of the motors 4.
Further, the drive control module 9 consists of DRV8701 drivers and high-power MOSFETs; the MOSFETs withstand up to 90 A, comfortably covering the motors' running current. The DRV8701 motor-driver chip is compact with strong drive capability and can drive a 24 V bidirectional brushed DC motor. It also simplifies motor control: speed and direction are set through the two pins PH/EN. The PH pin sets direction (1 for forward, 0 for reverse), while the EN pin sets speed through the duty cycle of a PWM signal: the larger the duty, the faster the motor. One DRV8701 controls one motor, so the application uses four DRV8701s to control the four motors 4 driving the tracks. Because the robot travels on tracks, it steers differentially: turning is achieved by a speed difference between the tracks on the two sides. This steering arrangement has a small turning radius, adapts to various terrain, and handles most obstacles.
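The PH/EN control scheme just described maps naturally onto a signed speed command. A minimal sketch of that mapping; the actual pin writes depend on the embedded board's GPIO/PWM API and are deliberately left out:

```python
def drv8701_command(speed):
    """Translate a signed speed command into DRV8701 PH/EN values.

    speed: -1.0 .. 1.0 (sign = direction, magnitude = PWM duty).
    Per the description: PH=1 forward, PH=0 reverse; the duty cycle
    on the EN pin sets motor speed. Writing these values to actual
    pins is board-specific and not shown here.
    """
    if not -1.0 <= speed <= 1.0:
        raise ValueError("speed must be in [-1, 1]")
    ph = 1 if speed >= 0 else 0   # direction pin state
    en_duty = abs(speed)          # PWM duty fraction on the enable pin
    return ph, en_duty

print(drv8701_command(0.75))   # (1, 0.75)  forward at 75% duty
print(drv8701_command(-0.5))   # (0, 0.5)   reverse at 50% duty
```

Applying this per side, with different speeds for the left and right tracks, yields the differential steering described above.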
In addition, given the structure described above, the robot's power supply works as follows. In practice, a 24 V rechargeable battery is used, with the charging interface brought out so the robot can conveniently return home to charge; a high-capacity battery guarantees the robot's working requirements. The 24 V battery powers the motors 4 directly and is stepped down to 5 V and 3.3 V by DC voltage-regulation units to power the various computing and sensor devices. A 5 V interface is brought out for easy expansion, and 5 V also powers the servos of the mechanical arm 7.
As for the power module 1, the XL4016 from XLSEMI may be used. This DC-DC chip accepts 8–36 V input, fully suiting the utility model's 24 V supply; its output is adjustable over 1.25–32 V, with a maximum output current of 12 A and conversion efficiency up to 93%. The chip integrates reliability features such as input over-voltage, over-current, over-temperature, and short-circuit protection. The power module 1 of this application is built from this chip and performs the 5 V and 3.3 V conversions; testing shows it can fully carry the whole system's load while providing a stable output voltage.
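The 1.25–32 V adjustable output quoted for the chip matches the usual feedback-divider relation for an adjustable regulator with a 1.25 V reference, Vout = Vref × (1 + R_top / R_bottom). A sketch of that relation with illustrative resistor values (the actual divider used in the robot is not given in the patent):

```python
def regulator_vout(r_top_ohm, r_bottom_ohm, v_ref=1.25):
    """Output voltage of an adjustable regulator via its feedback divider.

    Vout = Vref * (1 + R_top / R_bottom), assuming a 1.25 V reference
    (consistent with the 1.25 V lower bound quoted for the chip).
    Resistor values below are illustrative only.
    """
    return v_ref * (1.0 + r_top_ohm / r_bottom_ohm)

# 3.3 kOhm over 1.1 kOhm -> 1.25 V * 4 = 5.0 V rail
print(regulator_vout(3300.0, 1100.0))  # 5.0
```

With R_top = 0 the output sits at the 1.25 V reference, which is why the adjustment range bottoms out there.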
In summary, the present application provides an intelligent plant monitoring robot in which the binocular camera module 5, the DTU unit 10, the sensor module 11, and the radar module 6 all connect to the edge computing device 8-1 over USB cables, which carry both power and data. The edge computing device 8-1 connects to the display module 2 via an HDMI cable, and the display module 2 shows the operating-system interface and the various system control parameters. The drive control module 9 is managed by the lower-compute embedded computing device 8-2, and sensor data needing real-time processing, such as IMU data, is processed by the embedded computing device 8-2 before being sent to the edge computing device 8-1. The power module 1 powers the various devices, and the radar module 6 is a lidar.
The power supply module 1 supplies the motor 4 through a 24 V direct-current switch, and, after the supply is stepped down to 5 V and 3.3 V by the direct-current voltage-regulating unit, powers the edge computing device 8-1, the embedded computing device 8-2 and the sensor module 11. The central control module 8 includes an edge computing device 8-1, which handles high-computing-power, latency-tolerant tasks, and an embedded computing device 8-2, which handles low-computing-power, real-time control tasks. The motor 4 is driven by a drive control module 9 composed of a DRV8701 and high-power MOS transistors, which drives the four direct-current motors 4; the DRV8701 is small in size and strong in driving capability. The camera module 5 consists of the left and right cameras of a binocular camera, both of which are 1.3-megapixel USB cameras; the DTU unit 10 uses a USR-Cloud data transmission terminal to transmit and receive data.
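A minimal sketch of how a speed command could be mapped to the DRV8701's inputs, assuming the gate driver is wired in its PH/EN mode (PH pin selects direction, a PWM signal on EN sets the duty cycle). The mapping convention is an assumption for illustration, not taken from the application:

```python
def drv8701_ph_en(speed: float) -> tuple:
    """Map a signed speed command in [-1, 1] to (PH level, EN duty in %).

    Assumes PH/EN mode on the DRV8701 gate driver: PH high = forward,
    PH low = reverse (an illustrative convention), and the magnitude of
    the command sets the PWM duty cycle applied to EN.
    """
    speed = max(-1.0, min(1.0, speed))   # clamp the command
    ph = 1 if speed >= 0 else 0          # direction pin level
    duty = round(abs(speed) * 100)       # PWM duty cycle, percent
    return ph, duty

# Full speed forward, half speed reverse, stop:
print(drv8701_ph_en(1.0))    # (1, 100)
print(drv8701_ph_en(-0.5))   # (0, 50)
print(drv8701_ph_en(0.0))    # (1, 0)
```

On the real robot the returned pair would be written to a GPIO pin and a PWM peripheral of the embedded computing device 8-2; that hardware layer is omitted here.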
On this basis, the intelligent plant monitoring robot provided by the application can transmit data to the DTU unit 10 through a serial port, and the DTU unit 10 then transmits the information to the data center over a 4G network, so that workers no longer need to enter the workshop frequently to obtain information, effectively reducing the number of staff disinfection procedures. Meanwhile, the additionally installed temperature, humidity and illumination sensors can collect multiple environmental parameters during plant growth and record how the surrounding environment changes, providing a reference for the user when monitoring the plant growth state and making further decisions. Finally, the robot uses a two-dimension-code area-division technique whose indoor positioning accuracy can reach 0.5 m, which solves the indoor positioning problem for the vehicle and effectively avoids the insufficient indoor accuracy of GPS positioning; functional information about the robot's surroundings is added to each two-dimension code, so the robot can obtain the function of the current area by scanning the code, which facilitates interaction with the environment.
The foregoing description covers only the preferred embodiments of the present application and explains the principles of the technology employed. Persons skilled in the art will appreciate that the scope of the utility model referred to in this application is not limited to the specific combinations of features described above; it is also intended to cover other embodiments formed by any combination of the above features or their equivalents without departing from the spirit of the utility model, for example (but not limited to) embodiments in which the above features are interchanged with technical features of similar function disclosed in the present application.

Claims (8)

1. An intelligent plant monitoring robot, characterized in that it comprises:
a mobile module comprising a chassis (3), at least one driving wheel being arranged on each of two sides of the chassis (3);
an information acquisition module, the information acquisition module being arranged on the chassis (3) and comprising: a mechanical arm (7) and a monitoring end arranged on the mechanical arm (7); the mechanical arm (7) is used for driving the monitoring end to move towards the plant to be monitored;
a central control module (8), the central control module (8) being arranged on the chassis (3) and electrically connected with the information acquisition module; the central control module (8) comprises an information processing unit and a drive control unit;
wherein the information processing unit is used for receiving the plant information acquired by the monitoring end, and the drive control unit is provided with a plurality of drive control ends and is used for controlling the actions of the driving wheels and the mechanical arm (7).
2. The intelligent plant monitoring robot according to claim 1, characterized in that the central control module (8) comprises: an edge computing device (8-1) and an embedded computing device (8-2), the two devices cooperating to form the information processing unit and the drive control unit; the edge computing device (8-1) and the embedded computing device (8-2) exchange information with the data center via a DTU unit (10).
3. The intelligent plant monitoring robot of claim 1, wherein the mobile module further comprises:
bearing wheels rotatably arranged about their own axes on the two sides of the chassis (3); and
a crawler belt arranged around the outside of the bearing wheels and the driving wheels.
4. The intelligent plant monitoring robot of claim 2, wherein the information acquisition module further comprises:
a camera module (5), the camera module (5) being a binocular camera forming the monitoring end; the binocular camera comprises a left camera and a right camera, both of which are 1.3-megapixel USB cameras.
5. The intelligent plant monitoring robot according to claim 1 or 2, further comprising a sensor module (11), the sensor module (11) being arranged on the chassis (3) by means of a rotary steering engine and electrically connected to the central control module (8); the sensor module (11) is used for identifying road conditions.
6. The intelligent plant monitoring robot according to claim 2, further comprising a display module (2), the display module (2) being arranged on the chassis (3) and connected to the edge computing device (8-1) via an HDMI cable; the display module (2) is used for displaying the operating-system interface and system control parameters.
7. The intelligent plant monitoring robot according to claim 4, further comprising a radar module (6), the radar module (6) being arranged on the chassis (3) and electrically connected with the edge computing device (8-1).
8. The intelligent plant monitoring robot according to claim 2, further comprising a drive control module (9), the drive control module (9) being electrically connected to the embedded computing device (8-2) and adapted to control the operation of the motor (4).
CN202223443125.0U 2022-12-22 2022-12-22 Intelligent plant monitoring robot Active CN219161992U (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202223443125.0U CN219161992U (en) 2022-12-22 2022-12-22 Intelligent plant monitoring robot


Publications (1)

Publication Number Publication Date
CN219161992U true CN219161992U (en) 2023-06-09

Family

ID=86638191

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202223443125.0U Active CN219161992U (en) 2022-12-22 2022-12-22 Intelligent plant monitoring robot

Country Status (1)

Country Link
CN (1) CN219161992U (en)


Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116539613A (en) * 2023-07-07 2023-08-04 佳木斯大学 Mobile agricultural informatization online detection equipment
CN116539613B (en) * 2023-07-07 2023-10-13 佳木斯大学 Mobile agricultural informatization online detection equipment

Similar Documents

Publication Publication Date Title
US11892855B2 (en) Robot with perception capability of livestock and poultry information and mapping approach based on autonomous navigation
CN207139822U (en) Data center's crusing robot
CN101373380B (en) Humanoid robot control system and robot controlling method
CN208914092U (en) A kind of intelligent garbage sorting machine people
CN105128032B (en) Snake-shaped robot with nuclear equipment pipe detection function
CN219161992U (en) Intelligent plant monitoring robot
CN105150203A (en) Method for detecting internal environment of nuclear equipment pipeline by snake-like robot
CN102707675A (en) Swarm-robot controller, swarm-robot control method and controller terminal
CN108132670A (en) Multifunctional inspecting robot and method of work based on distributed AC servo system
CN113848208B (en) Plant phenotype platform and control system thereof
Chen et al. Design and implementation of an artificial intelligence of things-based autonomous mobile robot system for pitaya harvesting
CN113084776B (en) Intelligent epidemic prevention robot and system based on vision and multi-sensor fusion
CN114779692A (en) Linear sliding table type weeding robot and control method thereof
CN212683969U (en) Orchard multi-robot physical model
Wang et al. Smart agricultural in-field service robot: from toy to tool
Zhang et al. An automatic control system for ratbot navigation
CN113558031A (en) Intelligent targeting pesticide spraying system
CN114967495A (en) Orchard virtual simulation inspection system and method based on Internet of things cloud control platform
Monsalve et al. Development of agricultural robot platform with virtual laboratory capabilities
Tian et al. Smart and autonomous farm field scouting service robot as an edge device under $1000: Challenges and opportunities
CN205870544U (en) Interactive mechanical arm control system based on kinect
Gopikrishnan et al. Artificial Intelligent Former: A Chatbot-Based Smart Agriculture System
CN215736542U (en) Intelligent targeting pesticide spraying system
Patil et al. AgriDoc: ROS integrated agricultural robot
TWI806721B (en) Autonomous Mobile Artificial Intelligence Mushroom Cultivation Monitoring System and Method

Legal Events

Date Code Title Description
GR01 Patent grant