CN111290403B - Transport method for carrying automatic guided transport vehicle and carrying automatic guided transport vehicle - Google Patents


Info

Publication number
CN111290403B
CN111290403B (application CN202010209811.8A)
Authority
CN
China
Prior art keywords
map
vehicle
automatic guided
rgb
environment information
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202010209811.8A
Other languages
Chinese (zh)
Other versions
CN111290403A (en)
Inventor
董朝轶
母英泽
陈晓艳
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Inner Mongolia University of Technology
Original Assignee
Inner Mongolia University of Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Inner Mongolia University of Technology filed Critical Inner Mongolia University of Technology
Priority to CN202010209811.8A priority Critical patent/CN111290403B/en
Publication of CN111290403A publication Critical patent/CN111290403A/en
Application granted granted Critical
Publication of CN111290403B publication Critical patent/CN111290403B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G - PHYSICS
    • G05 - CONTROLLING; REGULATING
    • G05D - SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 - Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02 - Control of position or course in two dimensions
    • G05D1/021 - Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0231 - Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
    • G05D1/0238 - Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using obstacle or wall sensors
    • G05D1/024 - Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using obstacle or wall sensors in combination with a laser
    • G - PHYSICS
    • G05 - CONTROLLING; REGULATING
    • G05D - SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 - Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02 - Control of position or course in two dimensions
    • G05D1/021 - Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0212 - Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory
    • G05D1/0221 - Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory involving a learning process
    • G - PHYSICS
    • G05 - CONTROLLING; REGULATING
    • G05D - SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 - Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02 - Control of position or course in two dimensions
    • G05D1/021 - Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0212 - Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory
    • G05D1/0223 - Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory involving speed control of the vehicle
    • G - PHYSICS
    • G05 - CONTROLLING; REGULATING
    • G05D - SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 - Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02 - Control of position or course in two dimensions
    • G05D1/021 - Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0231 - Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
    • G05D1/0246 - Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means
    • G05D1/0251 - Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means extracting 3D information from a plurality of images taken from different locations, e.g. stereo vision
    • G - PHYSICS
    • G05 - CONTROLLING; REGULATING
    • G05D - SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 - Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02 - Control of position or course in two dimensions
    • G05D1/021 - Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0255 - Control of position or course in two dimensions specially adapted to land vehicles using acoustic signals, e.g. ultra-sonic signals
    • G - PHYSICS
    • G05 - CONTROLLING; REGULATING
    • G05D - SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 - Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02 - Control of position or course in two dimensions
    • G05D1/021 - Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0257 - Control of position or course in two dimensions specially adapted to land vehicles using a radar
    • G - PHYSICS
    • G05 - CONTROLLING; REGULATING
    • G05D - SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 - Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02 - Control of position or course in two dimensions
    • G05D1/021 - Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0276 - Control of position or course in two dimensions specially adapted to land vehicles using signals provided by a source external to the vehicle
    • Y - GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 - TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02P - CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P90/00 - Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P90/02 - Total factory control, e.g. smart factories, flexible manufacturing systems [FMS] or integrated manufacturing systems [IMS]

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Remote Sensing (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Electromagnetism (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Optics & Photonics (AREA)
  • Multimedia (AREA)
  • Acoustics & Sound (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

The invention provides a transportation method of an automatic guided transport vehicle and the automatic guided transport vehicle. The transportation method comprises the following steps: acquiring environment information of the automatic guided transport vehicle, and generating a map based on the environment information; receiving a communication instruction of a user, wherein the communication instruction is triggered after the user acquires the map; and sending, according to the communication instruction, a transportation instruction for controlling the automatic guided transport vehicle. Different maps are generated from different environment information to guide the user, so that the user can conveniently trigger control of the automatic guided transport vehicle. This alleviates labor-intensive carrying work that wastes scarce human resources and the carrier's energy and physical strength, such as library book transport, and improves transport efficiency.

Description

Transport method for carrying automatic guided transport vehicle and carrying automatic guided transport vehicle
Technical Field
The embodiments of the invention relate to the technical field of transport control, and in particular to a transportation method of an automatic guided transport vehicle and an automatic guided transport vehicle.
Background
With the rapid and continuous development of production technology in industry, manufacturing and related sectors, the mismatch between low-efficiency logistics transportation systems and high-efficiency production systems has become increasingly obvious, and the current level of the industrial logistics transportation process still lags well behind the production process. In product manufacturing, the high cost and wasted human resources caused by transportation and handling are important problems that every enterprise needs to solve. Automated guided vehicles (Automated Guided Vehicle, AGV for short) therefore emerged, and the AGV is the most basic and critical link in the automation of logistics and production as a whole. Using automatic guided transport vehicles in a logistics system not only reduces the waste of human resources but also improves the degree of automation, the efficiency and the flexibility of the logistics system.
At present, AGVs have been deployed in more and more fields and industries, and have become an important means of reducing wasted human resources and improving production efficiency. AGV carts are used not only in enterprises and factories but also on campuses: library book carrying, for example, is among the most laborious tasks for staff, and a librarian who carries large numbers of books every day expends a great deal of energy and physical strength.
Disclosure of Invention
The embodiment of the invention aims to provide a transportation method of an automatic guided transportation vehicle and the automatic guided transportation vehicle, which are used for solving the problems existing in the prior art.
To achieve the above object, in a first aspect, an embodiment of the present invention provides a transportation method for an automatic guided transport vehicle, the method including:
acquiring environment information of the automatic guided transporting vehicle, and generating a map based on the environment information;
receiving a communication instruction of a user, wherein the communication instruction is triggered after the user acquires the map;
and sending, according to the communication instruction, a transportation instruction for controlling the automatic guided transport vehicle.
Optionally, the obtaining the environmental information of the automatic guided vehicle and generating the map based on the environmental information includes:
acquiring environment information of the automatic guided vehicle;
preprocessing an image corresponding to the environment information, and extracting characteristic points;
determining whether the number of extracted feature points is greater than or equal to a preset threshold;
if the number of feature points is greater than or equal to the preset threshold, acquiring a first RGB-D data frame;
and constructing a 2D grid map according to the first RGB-D data frame.
Optionally, after determining whether the feature point is greater than or equal to a preset threshold, the method further includes:
if the number of feature points is less than the preset threshold, acquiring a second RGB-D data frame;
and constructing a 3D dense point cloud map according to the second RGB-D data frame.
Optionally, constructing a 2D grid map according to the first RGB-D data frame includes:
and constructing the 2D grid map through a Gmapping-SLAM algorithm according to the first RGB-D data frame.
Optionally, constructing a 3D dense point cloud map according to the second RGB-D data frame includes:
and constructing the 3D dense point cloud map through an RGB-D SLAM algorithm according to the second RGB-D data frame.
In a second aspect, an embodiment of the present invention further provides a master control system for handling an automatic guided vehicle, including:
the information acquisition module is used for acquiring the environment information of the automatic guided transporting vehicle and generating a map based on the environment information;
the instruction receiving module is used for receiving a communication instruction of a user, wherein the communication instruction is triggered after the user acquires the map;
and the instruction sending module is used for sending, according to the communication instruction, a transportation instruction for controlling the automatic guided transport vehicle.
In a third aspect, the embodiment of the invention also provides an automatic guided transporting vehicle, which comprises the main control system and a bottom layer control unit;
the main control system is used for acquiring the environment information of the automatic guided transporting vehicle and generating a map based on the environment information;
the main control system is also used for receiving a communication instruction of a user, wherein the communication instruction is triggered after the user acquires the map;
the bottom layer control unit is used for receiving the instruction of the main control system and controlling the movement of the automatic conveying guide transport vehicle according to the instruction.
Optionally, the automatic guided vehicle further includes: an environmental information acquisition device;
the environment information acquisition device is used for acquiring the environment information of the automatic guided transporting vehicle and sending the environment information to the main control system.
Optionally, the environmental information obtaining device is a depth camera.
According to the transportation method of the automatic guided transport vehicle and the automatic guided transport vehicle provided by the embodiments of the invention, different maps are generated from different environment information to guide the user, so that the user can conveniently trigger control of the automatic guided transport vehicle. This alleviates labor-intensive carrying work that wastes scarce human resources and the carrier's energy and physical strength, such as library book transport, and improves transport efficiency.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art are briefly described below. The drawings in the following description are only exemplary, and those skilled in the art may derive other drawings from them without inventive effort.
FIG. 1 is a flow chart of a method for transporting an automated guided vehicle according to an embodiment of the present invention;
FIG. 2 is a software schematic of an AGV cart according to one embodiment of the present invention;
FIG. 3 is a schematic diagram of a library plan view according to one embodiment of the present invention;
FIG. 4 is a schematic diagram of a transporting device for transporting an automated guided vehicle according to an embodiment of the present invention;
FIG. 5 is a schematic perspective view of an AGV cart according to one embodiment of the present invention;
FIG. 6 is a hardware schematic of an AGV cart according to an embodiment of the present invention.
Detailed Description
The present invention will be described in further detail with reference to the drawings and examples, in order to make the objects, technical solutions and advantages of the present invention more apparent. It should be understood that the specific embodiments described herein are for purposes of illustration only and are not intended to limit the scope of the invention. All other embodiments, which can be made by those skilled in the art based on the embodiments of the invention without making any inventive effort, are intended to be within the scope of the invention.
To further illustrate the technical solution of the present application, this embodiment takes an AGV in a library as an example. A librarian who carries large numbers of books every day expends a great deal of energy and physical strength, and the AGV mobile platform for library book carrying was invented to solve this problem. The administrator can determine the position of the cart from the vision camera and the map information created by a SLAM algorithm, and via remote wifi the AGV cart can automatically plan a path, travel to the corresponding station and stop there. The librarian only needs to wait at the station and, when the AGV cart arrives, load the books, so that the book-carrying work is taken over by the AGV cart, improving efficiency and saving time. Meanwhile, borrowers can also place borrowed books in the transport basket on the cart and, using the display screen on the cart, select a designated station for book returning.
Fig. 1 is a flow chart of a transportation method for carrying an automatic guided vehicle according to an embodiment of the invention, as shown in fig. 1, the method includes the following steps:
101. acquiring environment information of the automatic guided transporting vehicle, and generating a map based on the environment information;
the environmental information of the automatic guided vehicle in this embodiment may be understood as the environment where the automatic guided vehicle is currently located, i.e. a complex scene, such as a library borrowing area, a library reading area, etc., or a simple scene; simple scenes such as library lobbies, library service halls, etc.
102. Receiving a communication instruction of a user, wherein the communication instruction is triggered after the user acquires the map;
the user and transport vechicle communication can be through entity button control, also can carry out remote control transmission through control terminal, like bluetooth, wifi etc. this embodiment is not aligned and is limited.
103. And sending, according to the communication instruction, a transportation instruction for controlling the automatic guided transport vehicle.
According to the method, different maps are generated from different environment information to guide the user, so that the user can conveniently trigger control of the automatic guided transport vehicle. This alleviates labor-intensive carrying work that wastes scarce human resources and the carrier's energy and physical strength, such as library book transport, and improves transport efficiency. Switching between the two map construction modes for the localization and mapping of the AGV body guarantees real-time performance while providing readers and staff with rich visual information about unknown space.
Currently, the navigation modes of automated guided vehicles (Automated Guided Vehicle, AGV for short) mainly include magnetic navigation, inertial navigation, laser radar navigation, etc. Although these technologies are mature, each has disadvantages. Magnetic navigation uses magnetic nails or magnetic strips for automatic guidance, and a large number of them must be laid at specified positions before use, so the track cost is high, the vehicle can only run on fixed tracks, and the scheme is inflexible. Inertial navigation (Inertial Measurement Unit, IMU for short) cannot position accurately over long distances, and the wheel odometer easily accumulates errors. Laser radar navigation builds maps well, but is limited to the two-dimensional plane and cannot create a more complex three-dimensional map. Compared with these common modes, visual navigation is a novel navigation mode with the advantages of being able to acquire depth information, low cost and rich visual information. Visual sensors mainly include monocular cameras, binocular cameras, depth cameras, etc.; compared with monocular and binocular visual odometers, a depth visual sensor can directly acquire the color and depth information of the environment, which greatly improves the real-time performance and reliability of a simultaneous localization and mapping (Simultaneous Localization and Mapping, SLAM) system.
Fig. 2 is a software schematic diagram of an AGV cart according to an embodiment of the present invention, where in step 101, the obtaining environmental information of the automatic guided vehicle and generating a map based on the environmental information includes:
acquiring environment information of the automatic guided vehicle;
preprocessing an image corresponding to the environment information, and extracting characteristic points;
determining whether the number of extracted feature points is greater than or equal to a preset threshold;
if the number of feature points is greater than or equal to the preset threshold, acquiring a first RGB-D data frame;
and constructing a 2D grid map according to the first RGB-D data frame.
In another possible implementation of the present invention, after determining whether the number of feature points is greater than or equal to the preset threshold, the method further includes:
if the number of feature points is less than the preset threshold, acquiring a second RGB-D data frame;
and constructing a 3D dense point cloud map according to the second RGB-D data frame.
In the map construction process, constructing a 2D grid map according to the first RGB-D data frame includes:
and constructing the 2D grid map through a Gmapping-SLAM algorithm according to the first RGB-D data frame.
Constructing a 3D dense point cloud map from the second RGB-D data frame, comprising:
and constructing the 3D dense point cloud map through an RGB-D SLAM algorithm according to the second RGB-D data frame.
In the above embodiment, corresponding to fig. 2, the first RGB-D data frame may be understood as an RGB-D data frame whose number of feature points after feature point extraction is greater than or equal to the given threshold; the second RGB-D data frame may be understood as an RGB-D data frame whose number of feature points after feature point extraction is less than the given threshold.
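The mode selection described above can be sketched as follows. This is an illustrative fragment, not code from the patent; the concrete threshold value is an assumption, since the patent only speaks of "a preset threshold".

```python
# Hypothetical sketch of the mapping-mode decision: many features imply a
# cluttered scene (2D grid map mode), few features imply an open scene
# (3D dense point cloud mode). The threshold value 150 is assumed.
FEATURE_THRESHOLD = 150

def select_map_mode(num_features: int, threshold: int = FEATURE_THRESHOLD) -> str:
    """Return '2d_grid' when the FAST feature count reaches the threshold
    (first RGB-D data frame), otherwise '3d_dense' (second RGB-D data frame)."""
    return "2d_grid" if num_features >= threshold else "3d_dense"
```

Note that, as in the patent, a frame exactly at the threshold falls into the 2D grid branch ("greater than or equal to").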
The traditional depth camera mapping approach builds a 3D point cloud map. A 3D point cloud map visually reconstructs the real environment and restores the three-dimensional information of the real scene; compared with 2D mapping it provides richer environment information, so library administrators and readers can understand the environment of the AGV cart more accurately, which facilitates remote control and operation. However, the 3D point cloud map occupies too much space and contains much redundant information, which results in poor real-time performance in more complex field environments; real-time navigation cannot be achieved, leading to subsequent navigation errors and accidents. By contrast, a 2D visual map has relatively low computational complexity and conveniently supports real-time localization, navigation and obstacle avoidance in a complex environment, but a 2D map cannot fully reflect the environment around the AGV cart, which is inconvenient for the administrators' monitoring and operation. The invention therefore applies two algorithms, the Gmapping SLAM algorithm and the RGB-D SLAM algorithm, to different environments, creating a 2D grid map and a 3D dense point cloud map and switching between the two mapping modes.
In the embodiment of the invention, the depth camera first extracts environment information and transmits it to the main control system through USB 3.0. The Jetson TX2 performs image preprocessing and FAST feature point extraction on the data frame. When the number of extracted feature points is above the given threshold, the image information detected by the depth camera is relatively complex, reflecting that the AGV cart is probably in a relatively complex real environment such as a library self-study room or reading area, and the system switches to the 2D map mode. A depth image is acquired from the RGB-D data frame; outliers and invalid points are removed and the effective image area is screened, yielding the depth image {u, v} to be processed. From the depth image {u, v} and the camera model parameters obtained by camera calibration, the spatial point cloud {x, y, z} corresponding to each pixel of the depth image can be computed; projecting these points onto the xz plane yields the sensor data structure required by the Gmapping-SLAM algorithm, from which the two-dimensional grid map is constructed. The map and the station positions are stored, so that the control unit can drive the AGV to the carrying position corresponding to the station selected by the user.
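The back-projection from a depth pixel {u, v} to a spatial point {x, y, z}, and the subsequent projection onto the xz plane, follow the standard pinhole camera model. A minimal sketch, where the intrinsic parameters fx, fy, cx, cy stand for the values obtained by camera calibration (the concrete numbers in the usage note are assumptions):

```python
def pixel_to_point(u, v, depth, fx, fy, cx, cy):
    """Back-project one depth pixel (u, v) with depth value `depth` (metres)
    into a camera-frame point (x, y, z) using the pinhole model."""
    z = depth
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    return (x, y, z)

def project_to_xz(points):
    """Drop the y (height) coordinate: the 2D footprint used to build the
    sensor data structure for the grid-mapping step."""
    return [(x, z) for x, y, z in points]
```

For example, with fx = fy = 500, cx = 320, cy = 240, the principal-point pixel (320, 240) at depth 2.0 m back-projects to (0.0, 0.0, 2.0).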
When the number of extracted feature points is below the given threshold, the image information detected by the depth camera is relatively simple, reflecting that the AGV cart is probably in an open real environment such as a library hall or service area, and the system switches to the 3D map mode. First, the feature points extracted from adjacent RGB-D image data frames are matched using the fast approximate nearest neighbor (FLANN) algorithm; compared with the common brute-force matcher, FLANN reduces the amount of computation, is better suited to cases with very many matching points, and meets the real-time requirements of the AGV cart. The matches are then checked, and mismatches and edge feature points are removed by thresholding, retaining the valid data. The depth information corresponding to each matched point is looked up in the depth image and combined to form three-dimensional coordinate matching point pairs, giving a set of three-dimensional matched point pairs for two adjacent image data frames. From this set, a random sample consensus (RANSAC) method estimates the motion transformation between the two adjacent data frames, and an iterative closest point (ICP) algorithm optimizes it. The above constitutes the visual odometry (Visual Odometry, VO for short) part of SLAM in the 3D mode.
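As an illustration of the matching step only: the sketch below uses a brute-force nearest-neighbour search with a ratio test to reject ambiguous matches, standing in for the FLANN approximate search named in the text (the function name and the ratio value 0.7 are assumptions, not from the patent):

```python
from math import dist

def match_descriptors(desc_a, desc_b, ratio=0.7):
    """Match each descriptor in desc_a to its nearest neighbour in desc_b,
    keeping the match only if it is clearly better than the second-best
    candidate (ratio test), which discards ambiguous correspondences."""
    matches = []
    for i, d in enumerate(desc_a):
        order = sorted(range(len(desc_b)), key=lambda j: dist(d, desc_b[j]))
        best, second = order[0], order[1]
        if dist(d, desc_b[best]) < ratio * dist(d, desc_b[second]):
            matches.append((i, best))
    return matches
```

The surviving index pairs would then be combined with depth values into the 3D matched point pairs fed to RANSAC and ICP.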
Finally, while the localization algorithm runs, point cloud information is generated from the three-dimensional spatial information of the feature points; the obtained camera poses and point clouds are combined into a global point cloud, and the point cloud image is filtered to obtain a better result.
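The patent does not specify which filter is applied to the global point cloud; a common choice for trimming redundant points is a voxel grid filter, sketched here purely as an illustration (the voxel size is an assumed parameter):

```python
def voxel_filter(points, voxel=0.05):
    """Keep one representative point per voxel cell, thinning dense regions
    of the global point cloud while preserving its overall shape.
    Hypothetical stand-in for the unspecified filtering step."""
    cells = {}
    for x, y, z in points:
        key = (int(x // voxel), int(y // voxel), int(z // voxel))
        cells.setdefault(key, (x, y, z))  # first point in the cell wins
    return list(cells.values())
```

Two points closer together than the voxel size collapse into one, which is exactly the redundancy reduction the text motivates for real-time use of the 3D map.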
The software design of the embodiment of the invention is mainly divided into a chassis control program (written to an STM32 development board) and a main control system program (written to a Jetson TX2 development board). The main control system program comprises a visual localization program, a back-end optimization program, map construction, a path planning program, a wireless communication program, etc., and is mainly written in C++ on a combined Linux + ROS (Robot Operating System) system. The bottom-layer control program is mainly written in C; it collects ultrasonic sensor data, detects obstacles, controls the motors, collects the displacement information of the AGV chassis through a photoelectric encoder as the odometer input of the main control system, and receives control commands from the main control system. After data processing and the localization algorithm are completed, the main control system sends the current state of the AGV to the local area network designated by the user through the wifi communication program, and other computers in the local area network can obtain the current state information of the AGV by querying its IP address.
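The patent does not define the wire format of the wifi status message; one plausible sketch is to serialize the AGV state as JSON before sending it over the LAN. The field names below are hypothetical, chosen only to illustrate the idea:

```python
import json

def build_status_message(x, y, theta, station):
    """Serialize a hypothetical AGV status payload (pose plus current
    target station) as JSON. Field names are assumptions, not from the
    patent; the actual system is written in C++ on ROS."""
    return json.dumps(
        {"pose": {"x": x, "y": y, "theta": theta}, "station": station},
        sort_keys=True,
    )
```

Any computer on the local area network could then parse this payload after fetching it from the AGV's IP address.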
According to the invention, two algorithms, the Gmapping SLAM algorithm and the RGB-D SLAM algorithm, are applied to different environments, creating a 2D grid map and a 3D dense point cloud map, and the two mapping modes are switched between each other for the localization and mapping of the AGV body. The method thus guarantees real-time performance while providing readers and staff with rich visual information about unknown space.
Fig. 4 is a schematic structural diagram of a main control system of an automatic guided vehicle according to an embodiment of the present invention, where, as shown in fig. 4, the main control system includes:
an information acquisition module 41 for acquiring environmental information of the automated guided vehicle and generating a map based on the environmental information;
the instruction receiving module 42 is configured to receive a communication instruction of a user, where the communication instruction is triggered after the user obtains the map;
the instruction sending module 43 is configured to send a transportation instruction for controlling the automatic guided transportation vehicle according to the communication instruction.
It can be understood that the main control system of the automatic guided transport vehicle in this embodiment corresponds one-to-one with the transportation method described above, so a detailed description of the system is omitted here.
Through the master control system, as shown in the exemplary library plan of fig. 3: when the number of extracted feature points is higher than a given threshold, the image information detected by the depth camera is relatively complex, indicating that the AGV trolley may be in a complex real environment such as a library study room or reading area; the system then switches to the 2D map mode, so that the manager can direct the trolley to a designated carrying place and the corresponding book storage site. When the number of extracted feature points is lower than the given threshold, the image information detected by the depth camera is relatively simple, indicating that the AGV may be in a relatively open real environment such as a library hall or service area; the system then switches to the 3D map mode. In other words, the mapping mode switches to the 2D grid map in complex scenes (library borrowing areas, library reading areas, etc.) and to the 3D dense point cloud map in simple scenes (library halls, library service halls, etc.). By switching between the two-dimensional and three-dimensional maps according to the real environment, real-time performance is ensured while readers and staff are provided with rich visual information about unknown spaces.
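The switching rule above can be sketched as a small decision function. This is a minimal illustration only: the names (`selectMapMode`, `MapMode`) are hypothetical, and in the actual system the feature count would come from feature extraction (e.g. ORB) on the RGB frame delivered by the depth camera.

```cpp
#include <cassert>
#include <cstddef>

// Hypothetical map-construction modes used by the AGV master control.
enum class MapMode { Grid2D, DensePointCloud3D };

// Select the mapping mode from the number of feature points extracted from
// the current camera frame, following the rule described in the text:
//   many features -> complex scene (study room, reading area) -> 2D grid map
//   few features  -> open scene (hall, service area)          -> 3D point cloud
MapMode selectMapMode(std::size_t featureCount, std::size_t threshold) {
    if (featureCount >= threshold) {
        return MapMode::Grid2D;          // complex scene: favor real-time 2D mapping
    }
    return MapMode::DensePointCloud3D;   // open scene: richer 3D visualization
}
```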
FIG. 5 is a schematic perspective view of an AGV according to an embodiment of the present invention, and FIG. 6 is a schematic hardware diagram of the AGV according to an embodiment of the present invention. The automated guided vehicle of fig. 5 comprises the main control system described above and a bottom layer control unit:
the main control system is used for acquiring the environment information of the automatic guided transporting vehicle and generating a map based on the environment information;
the main control system is also used for receiving a communication instruction of a user, wherein the communication instruction is triggered after the user acquires the map;
the bottom layer control unit is used for receiving the instruction of the main control system and controlling the movement of the automatic guided transport vehicle according to the instruction.
According to the automatic guided transport vehicle of the invention, different maps are generated from different environmental information to guide the user, so that the user can conveniently trigger control of the vehicle. This alleviates labor-intensive tasks such as library book carrying, which waste the energy and physical strength of the carrier, and improves transportation efficiency.
The carrying automatic guiding transport vehicle further comprises: an environmental information acquisition device;
the environment information acquisition device is used for acquiring the environment information of the automatic guided transporting vehicle and sending the environment information to the main control system.
The environmental information acquisition device is a depth camera.
The invention aims to realize autonomous localization and simultaneous mapping of the AGV trolley in different real environments through a depth vision sensor, with remote control over WiFi, thereby reducing the workload of library managers and completing the time-consuming and labor-intensive book carrying work on their behalf. The invention adopts a Kinect v2 depth camera, which can simultaneously capture an RGB image and a depth image of the real scene, facilitating the construction of the three-dimensional map.
As shown in fig. 5, the AGV trolley mainly comprises a vehicle body 2, a transport basket 7, a chassis, a bottom layer control unit, a vision sensing module 1, an ultrasonic sensing module 10, wheels 3, gear motors, a battery, a heat dissipation module 4, a motor driving module, a power management module, a main control system, a wireless communication module, a voltage reduction module, an audible and visual alarm module 9, a display screen module 8, anti-collision edge strips 5, and the like. To prevent the vehicle body from tilting, the battery is placed in the center of the body; a touch screen and buttons (including power switch, emergency stop, etc.) are placed at the front of the body; the ultrasonic sensing modules are located at the front and rear of the body; and the vision sensing module is mounted on a bracket at the front of the body. The audible and visual alarm modules are located at the four corners of the AGV trolley, and the anti-collision edge strips run around the lower part of the body.
As shown in fig. 6, the master control system handles the computation-heavy operations such as the visual positioning algorithm, trajectory optimization and map construction. The wireless communication module is connected to the master control system: the manager can remotely issue instructions to the trolley over WiFi, the master control system passes control commands to the bottom layer control unit, and the bottom layer control unit controls the motors and drives the wheels through the motor driving module, thereby controlling the movement of the trolley. Within the same local area network, technicians can remotely monitor the real-time state of the AGV trolley by entering the IP address of the master control on a computer. The remotely viewable state includes: the environment map of the environment the trolley is in, the position of the trolley in the map, the travel route of the trolley, the current movement speed of the trolley, and the like. The vision sensing module is connected to the main control system and provides a color (RGB) image and a depth image over USB 3.0, so that the main control system can run the vision algorithms. The bottom layer control unit and the main control system are connected by an RS232-to-USB serial module for data transmission.
The bottom layer control unit is located inside the vehicle body and is powered through the power management module. It controls four ZD510 brushless motor drivers through a group of I/O ports to drive the four motors, and is also connected to the ultrasonic sensing module and the audible and visual alarm module. The ultrasonic sensing module adopts the HC-SR04, which provides non-contact distance sensing in the range of 2 cm to 400 cm and comprises an ultrasonic transceiver and a control circuit. When the ultrasonic sensing module detects an obstacle within range in front of the vehicle, the vehicle stops and the buzzer and LED lamp of the audible and visual alarm module are triggered, reminding pedestrians to step aside and the administrator to intervene in time; once the road is clear, the vehicle automatically resumes its transport work and proceeds to the corresponding station. The main control system of the trolley uses the Jetson TX2 development board from NVIDIA; although energy-saving and small in size, the Jetson TX2 has strong processing and computing capabilities and is suited to modern intelligent edge devices. The Jetson TX2 has two CPU clusters and a GPU with 256 CUDA cores, 8 GB of memory and 32 GB of storage, and supports WiFi and Bluetooth. The CPU part of the development board consists of two ARMv8 64-bit CPU clusters connected by a high-performance coherent interconnect: the Denver 2 (dual-core) CPU cluster is optimized for single-threaded performance, while the second cluster is a quad-core ARM Cortex-A57, better suited to multi-threaded applications.
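The obstacle-handling behavior described above (obstacle in range: stop and raise the alarm; road clear: automatically resume) can be sketched as a small guard class. This is an illustration under stated assumptions, not the actual firmware: the names (`ObstacleGuard`, `update`) and the stop distance are hypothetical, and in the real chassis program the distance would come from HC-SR04 echo timing while the outputs would drive the motor drivers, buzzer and LED.

```cpp
#include <cassert>

// Hypothetical guard implementing: obstacle within stop distance -> halt and
// raise the audible/visual alarm; road clear again -> resume transport.
class ObstacleGuard {
public:
    explicit ObstacleGuard(double stopDistanceCm) : stopDistanceCm_(stopDistanceCm) {}

    // Feed the latest ultrasonic distance reading (cm); returns true if the
    // vehicle may keep moving, false if it must stop.
    bool update(double distanceCm) {
        // A non-positive reading is treated as "no echo / nothing in range".
        stopped_ = (distanceCm > 0.0 && distanceCm <= stopDistanceCm_);
        return !stopped_;
    }

    // The buzzer and LED are active exactly while the vehicle is stopped.
    bool alarmActive() const { return stopped_; }

private:
    double stopDistanceCm_;
    bool stopped_ = false;
};
```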
The AGV trolley bottom layer control unit selects an STM32F103C8T6 microcontroller as the chassis driving main control chip. The STM32F103C8T6 is a 32-bit microcontroller of the STM32 series based on the ARM Cortex-M3 core, with 64 KB of program memory and a supply voltage of 2 V to 3.6 V; it runs the bottom layer control algorithms and communicates with the TX2 through the RS232-to-USB serial module. The battery is a 24 V 40 A 80 Ah rechargeable lithium battery, connected to the power management module, which converts the voltage and current and supplies power to the motors, the vision sensing module, the bottom layer control unit and the main control system respectively.
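The serial link between the STM32 chassis controller and the TX2 needs some framing so that odometry readings and control commands survive a noisy RS232 line. A minimal sketch of one such frame with an XOR checksum is shown below; the frame layout (header byte, length, payload, checksum) is an assumption for illustration, not the patented protocol.

```cpp
#include <cassert>
#include <cstddef>
#include <cstdint>
#include <vector>

// Hypothetical frame layout: [0xAA][len][payload...][XOR checksum of len+payload]
std::vector<uint8_t> encodeFrame(const std::vector<uint8_t>& payload) {
    std::vector<uint8_t> frame;
    frame.push_back(0xAA);                                   // start-of-frame marker
    frame.push_back(static_cast<uint8_t>(payload.size()));
    uint8_t checksum = static_cast<uint8_t>(payload.size());
    for (uint8_t b : payload) {
        frame.push_back(b);
        checksum ^= b;
    }
    frame.push_back(checksum);
    return frame;
}

// Returns true and fills `payload` only if the frame is intact.
bool decodeFrame(const std::vector<uint8_t>& frame, std::vector<uint8_t>& payload) {
    if (frame.size() < 3 || frame[0] != 0xAA) return false;
    uint8_t len = frame[1];
    if (frame.size() != static_cast<std::size_t>(len) + 3) return false;
    uint8_t checksum = len;
    for (std::size_t i = 0; i < len; ++i) checksum ^= frame[2 + i];
    if (checksum != frame.back()) return false;              // corrupted on the wire
    payload.assign(frame.begin() + 2, frame.end() - 1);
    return true;
}
```

A corrupted byte anywhere in the frame makes the checksum mismatch, so the receiver can simply drop the frame and wait for the next one.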
The functional modules in the embodiments of the present invention may be integrated in one processing unit, or each module may exist alone physically, or two or more modules may be integrated in one unit. The integrated units may be implemented in hardware or in hardware plus software functional units.
The integrated units implemented in the form of software functional units described above may be stored in a computer readable storage medium. The software functional unit is stored in a storage medium, and includes several instructions for causing a computer apparatus (which may be a personal computer, a server, or a network apparatus, etc.) or a smart terminal device or a Processor (Processor) to perform part of the steps of the methods according to the embodiments of the present invention. And the aforementioned storage medium includes: a U-disk, a removable hard disk, a Read-Only Memory (ROM), a random access Memory (Random Access Memory, RAM), a magnetic disk, or an optical disk, or other various media capable of storing program codes.
In the foregoing embodiments of the present invention, it should be understood that the disclosed apparatus and method may be implemented in other manners. For example, the apparatus embodiments described above are merely illustrative, e.g., the division of modules is merely a logical function division, and there may be additional divisions of actual implementation, e.g., multiple modules or components may be combined or integrated into another system, or some features may be omitted, or not performed.
The modules illustrated as separate components may or may not be physically separate, and components shown as modules may or may not be physical modules, i.e., may be located in one place, or may be distributed over a plurality of network elements. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of this embodiment.

Claims (8)

1. A method of transporting an automated guided vehicle, the method comprising:
acquiring environment information of the automatic guided transporting vehicle, and generating a map based on the environment information;
receiving a communication instruction of a user, wherein the communication instruction is triggered after the user acquires the map;
sending, according to the communication instruction, a transportation instruction for controlling the automatic guided transport vehicle;
the obtaining the environmental information of the automatic guided vehicle and generating a map based on the environmental information includes:
acquiring environment information of the automatic guided vehicle;
preprocessing an image corresponding to the environment information, and extracting feature points;
judging whether the number of the feature points is greater than or equal to a preset threshold;
if the number of the feature points is greater than or equal to the preset threshold, acquiring a first RGB-D data frame;
and constructing a 2D grid map according to the first RGB-D data frame.
2. The method of claim 1, wherein,
after judging whether the number of the feature points is greater than or equal to the preset threshold, the method further comprises:
if the number of the feature points is smaller than the preset threshold, acquiring a second RGB-D data frame;
and constructing a 3D dense point cloud map according to the second RGB-D data frame.
3. The method of claim 1, wherein constructing a 2D raster map from the first RGB-D data frame comprises:
and constructing a 2D grid map through a mapping SLAM algorithm according to the first RGB-D data frame.
4. The method of claim 2, wherein constructing a 3D dense point cloud map from the second RGB-D data frame comprises:
and constructing a 3D dense point cloud map through an RGB-D SLAM algorithm according to the second RGB-D data frame.
5. A master control system for a handling automated guided vehicle, comprising:
the information acquisition module is used for acquiring the environment information of the automatic guided transporting vehicle and generating a map based on the environment information;
the instruction receiving module is used for receiving a communication instruction of a user, wherein the communication instruction is triggered after the user acquires the map;
the instruction sending module is used for sending, according to the communication instruction, a transportation instruction for controlling the automatic guided transport vehicle;
the information acquisition module is also used for acquiring the environment information of the automatic guided transporting vehicle;
preprocessing an image corresponding to the environment information, and extracting feature points;
judging whether the number of the feature points is greater than or equal to a preset threshold;
if the number of the feature points is greater than or equal to the preset threshold, acquiring a first RGB-D data frame;
and constructing a 2D grid map according to the first RGB-D data frame.
6. A handling automated guided vehicle, comprising: an underlying control unit and the master control system of claim 5;
the main control system is used for acquiring the environment information of the automatic guided transporting vehicle and generating a map based on the environment information;
the main control system is also used for receiving a communication instruction of a user, wherein the communication instruction is triggered after the user acquires the map;
the bottom layer control unit is used for receiving the instruction of the main control system and controlling the movement of the automatic guided transport vehicle according to the instruction.
7. The automated guided vehicle of claim 6, further comprising: an environmental information acquisition device;
the environment information acquisition device is used for acquiring the environment information of the automatic guided transporting vehicle and sending the environment information to the main control system.
8. The automated guided vehicle of claim 7, wherein the environmental information acquisition device is a depth camera.
CN202010209811.8A 2020-03-23 2020-03-23 Transport method for carrying automatic guided transport vehicle and carrying automatic guided transport vehicle Active CN111290403B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010209811.8A CN111290403B (en) 2020-03-23 2020-03-23 Transport method for carrying automatic guided transport vehicle and carrying automatic guided transport vehicle


Publications (2)

Publication Number Publication Date
CN111290403A CN111290403A (en) 2020-06-16
CN111290403B true CN111290403B (en) 2023-05-16

Family

ID=71022091

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010209811.8A Active CN111290403B (en) 2020-03-23 2020-03-23 Transport method for carrying automatic guided transport vehicle and carrying automatic guided transport vehicle

Country Status (1)

Country Link
CN (1) CN111290403B (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112648008B (en) * 2020-12-28 2023-09-29 北京宸控科技有限公司 Hydraulic support carrying method
CN113067386B (en) * 2021-04-07 2023-06-06 山东沐点智能科技有限公司 Wireless charging system and method for explosion-proof inspection robot
CN113219971A (en) * 2021-05-07 2021-08-06 西南交通大学 Multifunctional intelligent logistics trolley based on convolutional neural network and multithreading parallel control
CN113218384B (en) * 2021-05-19 2022-05-06 中国计量大学 Indoor AGV self-adaptive positioning method based on laser SLAM

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105094130A (en) * 2015-07-29 2015-11-25 广东省自动化研究所 AGV (Automatic Guided Vehicle) navigation method and device constructed by laser guidance map
CN109341707A (en) * 2018-12-03 2019-02-15 南开大学 Mobile robot three-dimensional map construction method under circumstances not known

Family Cites Families (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9037396B2 (en) * 2013-05-23 2015-05-19 Irobot Corporation Simultaneous localization and mapping for a mobile robot
CN104596533B (en) * 2015-01-07 2017-08-01 上海交通大学 Automatic guided vehicle and its guidance method based on map match
KR101695557B1 (en) * 2015-07-17 2017-01-24 고려대학교 산학협력단 Automated guided vehicle system based on autonomous mobile technique and a method for controlling the same
CN107515002A (en) * 2016-06-17 2017-12-26 趣之科技(深圳)有限公司 A kind of systems approach and device that the real-time indoor map structure of robot and location navigation are realized based on LiDAR and cloud computing
CN106598052A (en) * 2016-12-14 2017-04-26 南京阿凡达机器人科技有限公司 Robot security inspection method based on environment map and robot thereof
CN107450561A (en) * 2017-09-18 2017-12-08 河南科技学院 The autonomous path planning of mobile robot and obstacle avoidance system and its application method
WO2019169540A1 (en) * 2018-03-06 2019-09-12 斯坦德机器人(深圳)有限公司 Method for tightly-coupling visual slam, terminal and computer readable storage medium
CN108710376A (en) * 2018-06-15 2018-10-26 哈尔滨工业大学 The mobile chassis of SLAM and avoidance based on Multi-sensor Fusion
CN109883418A (en) * 2019-01-17 2019-06-14 中国科学院遥感与数字地球研究所 A kind of indoor orientation method and device
CN110763225B (en) * 2019-11-13 2023-05-09 内蒙古工业大学 Trolley path navigation method and system and transport vehicle system


Also Published As

Publication number Publication date
CN111290403A (en) 2020-06-16

Similar Documents

Publication Publication Date Title
CN111290403B (en) Transport method for carrying automatic guided transport vehicle and carrying automatic guided transport vehicle
Moshayedi et al. AGV (automated guided vehicle) robot: Mission and obstacles in design and performance
CN104536445B (en) Mobile navigation method and system
CN114474061B (en) Cloud service-based multi-sensor fusion positioning navigation system and method for robot
WO2018085291A1 (en) Systems and methods for robotic mapping
US20190064828A1 (en) Autonomous yard vehicle system
CN108762255A (en) A kind of indoor intelligent mobile robot and control method
CN107515002A (en) A kind of systems approach and device that the real-time indoor map structure of robot and location navigation are realized based on LiDAR and cloud computing
CN113311821B (en) Drawing and positioning system and method for multi-pendulous pipeline flaw detection mobile robot
CN111459172A (en) Autonomous navigation system of boundary security unmanned patrol car
WO2023036083A1 (en) Sensor data processing method and system, and readable storage medium
CN114167866B (en) Intelligent logistics robot and control method
CN113674355A (en) Target identification and positioning method based on camera and laser radar
Gajjar et al. A comprehensive study on lane detecting autonomous car using computer vision
CN108646759B (en) Intelligent detachable mobile robot system based on stereoscopic vision and control method
WO2024036984A1 (en) Target localization method and related system, and storage medium
CN109389677A (en) Real-time construction method, system, device and the storage medium of house three-dimensional live map
CN115327571A (en) Three-dimensional environment obstacle detection system and method based on planar laser radar
CN112965494B (en) Control system and method for pure electric automatic driving special vehicle in fixed area
Pang et al. A Low-Cost 3D SLAM System Integration of Autonomous Exploration Based on Fast-ICP Enhanced LiDAR-Inertial Odometry
WO2022033089A1 (en) Method and device for determining three-dimensional information of object to undergo detection
Nabbe et al. Opportunistic use of vision to push back the path-planning horizon
Zhao et al. The construction method of the digital operation environment for bridge cranes
Ai et al. Design of an indoor surveying and mapping robot based on SLAM technology
Balasooriya et al. Development of the smart localization techniques for low-power autonomous rover for predetermined environments

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant