CN112947426A - Cleaning robot motion control system and method based on multi-sensing fusion - Google Patents
- Publication number
- CN112947426A (application CN202110137126.3A)
- Authority
- CN
- China
- Prior art keywords
- cleaning robot
- module
- processing module
- information
- micro
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/02—Control of position or course in two dimensions
- G05D1/021—Control of position or course in two dimensions specially adapted to land vehicles
- G05D1/0212—... with means for defining a desired trajectory
- G05D1/0214—... in accordance with safety or protection criteria, e.g. avoiding hazardous areas
- G05D1/0221—... involving a learning process
- G05D1/0223—... involving speed control of the vehicle
- G05D1/0225—... involving docking at a fixed facility, e.g. base station or loading bay
- G05D1/0231—... using optical position detecting means
- G05D1/0238—... using obstacle or wall sensors
- G05D1/024—... using obstacle or wall sensors in combination with a laser
- G05D1/0257—... using a radar
- G05D1/0276—... using signals provided by a source external to the vehicle
Abstract
The invention discloses a cleaning robot motion control system based on multi-sensor fusion, comprising an environment perception module, a bottom-layer main control module, a micro-processing module, a central processing module and a servo driving module. The environment perception module controls the motion attitude of the cleaning robot by fusing multiple sensing modes: an AI target recognition module, an area array radar module and a laser radar module. The system and method provided by the invention realize flexible control of the cleaning robot, avoid manual intervention in the autonomous cleaning operation, greatly reduce the entanglement risk, improve the operating efficiency of the cleaning robot, and increase reliability and safety.
Description
Technical Field
The invention relates to the technical field of cleaning robots, and in particular to a motion control system and method for a cleaning robot based on multi-sensor fusion.
Background
With the development of artificial intelligence, and in particular the growing maturity of autonomous driving technology, the commercial deployment of low-speed unmanned cleaning robots in indoor and outdoor environments has become feasible. Against the background of an accelerating aging of the population and of recruitment difficulties caused by the low efficiency and high intensity of cleaning work, commercial cleaning robots rely on AI technologies including SLAM algorithms, computer vision, multi-sensor fusion algorithms and automatic path planning to complete full-coverage floor cleaning of large scenes autonomously, releasing large numbers of cleaners from repetitive, low-efficiency work.
At present, common cleaning robots on the market can autonomously avoid fixed or moving obstacles ahead using sensors such as lidar, ultrasonic sensors and an IMU together with a SLAM algorithm, and some can estimate the distance to the obstacle ahead with a depth camera. However, a depth camera relies on pure image-feature matching, so it performs very poorly under dark or overexposed lighting, and if the observed scene lacks texture, feature extraction and matching become difficult. Ultrasonic ranging exploits the known propagation speed of sound in air and its reflection from obstacles, and has the following defects: a. the measurement precision of an ultrasonic rangefinder is only centimeter-level; b. the emitted sound wave spreads in a sector, so when many obstacles lie in its path, many reflections return, causing strong interference and frequent false readings; c. the actual distance from the transmitter to the obstacle is computed from the time difference between transmission and reception, so ultrasonic ranging has an inherent delay, and a delayed sensor poses a high risk on a mobile robot.
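The time-of-flight relation underlying ultrasonic ranging described above can be sketched as follows; the speed-of-sound constant is an assumed room-temperature value, not a figure from the patent.

```python
# Ultrasonic ranging: distance is half the echo round-trip time multiplied
# by the speed of sound in air (~343 m/s at 20 degrees C, assumed here).
SPEED_OF_SOUND = 343.0  # m/s

def ultrasonic_distance(round_trip_s: float) -> float:
    """Distance to the obstacle from the echo round-trip time in seconds."""
    return SPEED_OF_SOUND * round_trip_s / 2.0

print(ultrasonic_distance(0.01))  # a 10 ms round trip corresponds to 1.715 m
```

Any timing jitter enters this formula directly, which is why the text flags the delay as a risk for a moving robot.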
Therefore, cleaning robots currently on the market suffer from various technical shortcomings, and no motion control scheme yet exists in which a cleaning robot distinguishes whether the obstacle ahead is an object or a pedestrian by fusing AI target recognition, an area array radar module and lidar sensing.
Disclosure of Invention
Based on the technical problems in the background art, the invention provides a cleaning robot motion control system based on multi-sensor fusion, which avoids manual intervention in the autonomous cleaning operation, greatly reduces the entanglement risk, improves the operating efficiency of the cleaning robot, and increases reliability and safety.
The cleaning robot motion control system based on multi-sensor fusion comprises:
the environment sensing module is used for acquiring environment information of the cleaning robot in a working state;
the bottom layer main control module is used for acquiring working parameter information of the cleaning robot in a working state;
the micro-processing module is used for receiving the information sent by the environment sensing module and the bottom layer main control module and transmitting the received information to the central processing module;
the central processing module is used for processing the received information to obtain decision information and transmitting the decision information to the micro-processing module;
and the servo driving module is used for receiving the decision information sent by the micro-processing module and controlling the motion attitude of the cleaning robot.
Preferably, the environmental information in the working state of the cleaning robot includes: whether a person or an object exists in front of the cleaning robot or not, and the distance between the cleaning robot and the person or the object in front; the working parameter information of the cleaning robot in the working state comprises acceleration, angular velocity and pose information of the cleaning robot body.
Preferably, the environment awareness module comprises:
the AI target recognition module is used for judging whether a person is in front of the cleaning robot;
the area array radar module is used for judging the distance between the human and the cleaning robot body;
and the laser radar module is used for judging whether an obstacle exists in front of the cleaning robot or not and judging the distance between the obstacle and the cleaning robot body under the condition that the obstacle exists.
Preferably, the bottom master control module includes:
the inertia measurement unit is used for acquiring the acceleration and angular velocity of the cleaning robot during movement; the odometer unit is used for acquiring pose information of the cleaning robot. The odometer serves as an effective sensor for the relative positioning of the mobile robot and provides it with real-time pose information. The pose information comprises the position and the posture of the cleaning robot; specifically, the position is the travel distance of the cleaning robot and the posture is its current heading.
Preferably, the central processing module processes the received information with a SLAM algorithm, and the decision information obtained includes a stop instruction, a forward instruction and a detour instruction; the motion postures of the cleaning robot include forward motion, paused motion and detour motion.
Information about obstacles acquired by the laser radar module is transmitted directly to the central processing module; the working parameter information of the cleaning robot acquired by the bottom-layer main control module is likewise transmitted to the central processing module through the micro-processing module, and the central processing module produces decision information based on the existing SLAM algorithm.
the method comprises the steps that information about pedestrians acquired by an area array laser radar module and an AI target recognition module is transmitted to a micro-processing module, the micro-processing module transmits the information about the pedestrians and working parameter information of the cleaning robot from a bottom layer main control module to a central processing module, the central processing module makes decision information based on an improved SLAM algorithm, and in the improved SLAM algorithm, an area array radar and an AI target recognition control plug-in are added to an original frame, so that the cleaning robot is controlled to pause, detour and advance according to signals recognized by the area array radar and the AI target and multi-sensor fusion of the laser radar.
The invention subdivides the working environment of the cleaning robot and specifies a different motion posture for each situation:
(1) In the first case: the AI target recognition module in the environment perception module does not recognize a person and the laser radar module does not recognize an obstacle; there is nothing in front of the robot, so the central processing module sends a forward instruction and the cleaning robot advances normally;
(2) In the second case: the AI target recognition module in the environment perception module recognizes a person and the area array radar module judges that the distance between the person and the cleaning robot is greater than a preset value, while the laser radar module recognizes an obstacle and judges that the distance between the obstacle and the cleaning robot is also greater than the preset value; the central processing module sends a forward instruction and the cleaning robot advances normally;
(3) In the third case: the AI target recognition module in the environment perception module recognizes a person and the area array radar module judges that the distance between the person and the cleaning robot is smaller than the preset value; the central processing module then sends a stop instruction and the cleaning robot pauses. When the pedestrian leaves, two situations are distinguished: in the first, the pedestrian has moved away and the AI target recognition module no longer recognizes a person; in the second, the AI target recognition module still recognizes the pedestrian, but the distance detected by the area array radar is now greater than the preset value. In both situations the cleaning robot resumes its advance along the original path direction.
(4) In the fourth case: the AI target recognition module in the environment perception module does not recognize a person but the laser radar module recognizes an obstacle; when the distance between the cleaning robot and the obstacle is smaller than the preset value, the central processing module sends a detour instruction and the cleaning robot performs a detour.
When the environment perception module detects both a person and an obstacle in front of the cleaning robot, the robot is commanded to pause. The motion control system thus fully accounts for the fact that a person has a higher priority than an obstacle; the area array radar module measures the distance between a pedestrian and the cleaning robot in real time with millimeter-level accuracy, effectively safeguarding human safety.
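The four cases above, with the explicit person-over-obstacle priority, can be sketched as a small decision function. The function and `preset` threshold are illustrative; the patent leaves the concrete preset value to the embodiment.

```python
from enum import Enum

# Sketch of the four-case fused decision logic described in the text.
class Command(Enum):
    FORWARD = "forward"
    STOP = "stop"
    DETOUR = "detour"

def decide(person_seen, person_dist, obstacle_seen, obstacle_dist, preset=1.5):
    # Case 3 (and the combined person + obstacle case): a nearby person
    # always takes priority, so the robot pauses rather than detours.
    if person_seen and person_dist is not None and person_dist < preset:
        return Command.STOP
    # Case 4: an obstacle closer than the preset value with no nearby person.
    if obstacle_seen and obstacle_dist is not None and obstacle_dist < preset:
        return Command.DETOUR
    # Cases 1 and 2: nothing within the preset distance, keep advancing.
    return Command.FORWARD
```

Ordering the person check first is what encodes the priority: even when both a person and an obstacle are close, the result is a stop, never a detour.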
The cleaning robot motion control method based on multi-sensing fusion comprises the following steps:
s1: the environment sensing module acquires environment information of the cleaning robot in a working state and sends the environment information to the micro-processing module;
s2: the bottom layer main control module acquires working parameter information of the cleaning robot in a working state and sends the working parameter information to the micro-processing module;
s3: the micro-processing module sends the information obtained in S1 and S2 to the central processing module, and the central processing module processes the received information to obtain decision information and transmits the decision information to the micro-processing module;
s4: and the micro-processing module transmits the decision information to the servo driving module, and the servo driving module drives the cleaning robot to move.
Preferably, when the environment sensing module detects that a person is in front of the cleaning robot, the central processing module sends a stop instruction, and the cleaning robot suspends the movement; when the environment perception module detects that an obstacle is in front of the cleaning robot, the central processing module sends a detouring instruction, and the cleaning robot performs detouring movement to avoid the obstacle.
The invention also provides a computer-readable storage medium, on which a computer program is stored which, when being executed by a processor, carries out the steps of the above-mentioned method.
The invention also provides a cleaning robot, which comprises a robot body and a controller, wherein the controller comprises a memory, a processor and a computer program which is stored on the memory and can run on the processor, and the steps of the method are realized when the processor executes the program.
The cleaning robot is characterized in that an area array laser radar and an AI target recognition lens are arranged on the front end face of the cleaning robot, and a laser radar is arranged on the front side of the bottom face of the cleaning robot.
The servo driving system comprises two drive-wheel servo motors, each controlling one drive wheel, so that differential operation of the two drive wheels steers the cleaning robot body. The micro-processing module sends the decision commands from the central processing module to the servo driving module over a 485 communication protocol to control the two drive wheels in real time. Photoelectric encoders mounted on the two drive wheels feed the acquired data back to the odometer unit in real time, which passes it to the micro-processing module; the micro-processing module then adjusts the drive-wheel motion at the next moment according to the fed-back speed information. This exchange of information realizes the interaction between the servo driving module and the micro-processing module.
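The differential operation mentioned above follows standard differential-drive kinematics; a minimal sketch is below. The axle length is a hypothetical value, since the patent gives no dimensions.

```python
# Differential drive: unequal wheel speeds turn the robot body.
WHEEL_BASE = 0.4  # meters between the two drive wheels, assumed

def wheel_speeds(v, w):
    """Left and right drive-wheel speeds (m/s) for a desired body linear
    velocity v (m/s) and angular velocity w (rad/s)."""
    left = v - w * WHEEL_BASE / 2.0
    right = v + w * WHEEL_BASE / 2.0
    return left, right

# Equal speeds drive straight; a faster right wheel turns the body left.
print(wheel_speeds(0.5, 0.0))  # (0.5, 0.5)
```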
The invention has the following beneficial effects:
the invention provides a cleaning robot motion control system based on multi-sensing fusion, which comprises an environment sensing module, a bottom layer main control module, a micro-processing module, a central processing module and a servo driving module. The environment perception module realizes multi-sensing mode fusion through the AI target recognition module, the area array radar module and the laser radar module, the central processing module controls the movement of the cleaning robot according to the improved SLAM algorithm, the control system fully considers the priority level of pedestrians, and when the cleaning robot faces the pedestrians and obstacles, different movement postures can be made; the control system is used for the cleaning robot, flexible control can be realized, manual interference in cleaning operation of the self-cleaning robot is avoided, winding risks are greatly reduced, the operation efficiency of the cleaning robot is improved, and reliability and safety are improved.
Drawings
The accompanying drawings, which are included to provide a further understanding of the invention and are incorporated in and constitute a part of this specification, illustrate embodiments of the invention and together with the description serve to explain the principles of the invention and not to limit the invention. In the drawings:
FIG. 1 is an overall schematic diagram of a cleaning robot motion control system based on multi-sensing fusion;
FIG. 2 is a schematic diagram of the principle of three motion control commands of the underlying industrial control system of the present invention;
FIG. 3 is a schematic view of the cleaning robot of the present invention;
fig. 4 is a schematic diagram of an AI target detection neural network of the present invention.
In the figure: 1, area array laser radar; 2, AI target recognition lens; 3, laser radar.
Detailed Description
The present invention will be further illustrated with reference to the following specific examples.
Referring to fig. 1-4, the cleaning robot motion control system based on multi-sensing fusion comprises an environment sensing module, a bottom layer main control module, a micro-processing module, a central processing module and a servo driving module.
The environmental perception module is used for acquiring environmental information of the cleaning robot in a working state, and the environmental information of the cleaning robot in the working state comprises: whether a person or an object is in front of the cleaning robot, and the distance between the cleaning robot and the person or the object in front.
The environment sensing module comprises: the AI target recognition module is used for judging whether a person is in front of the cleaning robot; the area array radar module is used for judging the distance between the human and the cleaning robot body; and the laser radar module is used for judging whether an obstacle exists in front of the cleaning robot or not and judging the distance between the obstacle and the cleaning robot body under the condition that the obstacle exists.
The bottom-layer main control module is used for acquiring working parameter information of the cleaning robot in the working state; this information comprises the acceleration, angular velocity and travel distance of the cleaning robot body. The bottom-layer main control module comprises: an inertia measurement unit for acquiring the acceleration and angular velocity of the cleaning robot during movement; and an odometer unit for acquiring pose information of the robot, wherein the pose information comprises the position and the posture of the cleaning robot; specifically, the position is the travel distance of the cleaning robot and the posture is its current heading.
The central processing module is used for processing the received information to obtain decision information and transmitting it to the micro-processing module. It processes the received information with a SLAM algorithm, and the decision information obtained includes a stop instruction, a forward instruction and a detour instruction; the motion postures of the cleaning robot include forward motion, paused motion and detour motion.
The micro-processing module is used for receiving the information sent by the environment sensing module and the bottom layer main control module and transmitting the received information to the central processing module; and the servo driving module is used for receiving the decision information sent by the micro-processing module and controlling the motion attitude of the cleaning robot.
The micro-processing module sends the decision commands from the central processing module to the servo driving module over a 485 communication protocol to control the two drive wheels in real time. Photoelectric encoders mounted on the two drive wheels feed the acquired data back to the odometer unit in real time, which passes it to the micro-processing module; the micro-processing module forwards this information to the central processing module, which combines all the information to obtain the decision for the next moment and returns it to the micro-processing module. The micro-processing module then adjusts the drive-wheel motion at the next moment according to the fed-back speed information; this exchange realizes the information interaction between the servo driving module and the micro-processing module.
The odometer unit serves as an effective sensor for the relative positioning of the cleaning robot and provides it with real-time pose information. The photoelectric encoders mounted on the left and right drive-wheel motors measure the angle each wheel turns within a given time; together with data such as the wheel diameter, this is passed to the odometer unit, from which the change in the robot's relative pose is calculated.
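The dead-reckoning computation just described can be sketched as follows. Encoder resolution and wheel dimensions are illustrative values only; the patent does not specify them.

```python
import math

# Odometry sketch: integrate one encoder sample into the (x, y, theta) pose.
TICKS_PER_REV = 1024   # encoder counts per wheel revolution, assumed
WHEEL_DIAMETER = 0.15  # meters, assumed
WHEEL_BASE = 0.4       # meters between drive wheels, assumed

def update_pose(x, y, theta, left_ticks, right_ticks):
    """Differential-drive dead reckoning from photoelectric-encoder ticks."""
    per_tick = math.pi * WHEEL_DIAMETER / TICKS_PER_REV  # arc length per tick
    dl = left_ticks * per_tick        # left wheel travel
    dr = right_ticks * per_tick       # right wheel travel
    d = (dl + dr) / 2.0               # body-centre travel
    dtheta = (dr - dl) / WHEEL_BASE   # heading change
    # Integrate along the mid-heading for better accuracy over one sample.
    x += d * math.cos(theta + dtheta / 2.0)
    y += d * math.sin(theta + dtheta / 2.0)
    return x, y, theta + dtheta
```

Equal tick counts yield pure translation; opposite counts yield rotation in place, which matches the differential steering described for the servo drive.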
The cleaning robot motion control method based on multi-sensing fusion comprises the following steps:
s1: the environment sensing module acquires environment information of the cleaning robot in a working state and sends the environment information to the micro-processing module;
s2: the bottom layer main control module acquires working parameter information of the cleaning robot in a working state and sends the working parameter information to the micro-processing module;
s3: the micro-processing module sends the information obtained in S1 and S2 to the central processing module, and the central processing module processes the received information to obtain decision information and transmits the decision information to the micro-processing module;
s4: the micro-processing module transmits the decision information to the servo driving module, and the servo driving module drives the cleaning robot to move.
When the environment perception module detects a person in front of the cleaning robot, the central processing module sends a stop instruction and the cleaning robot pauses; when it detects an obstacle in front, the central processing module sends a detour instruction and the cleaning robot detours around the obstacle.
Fig. 2 further explains the control method of the movement control system of the cleaning robot in the present embodiment:
when the AI target recognition module judges that a person is in front and the area array radar module detects that the distance between the pedestrian and the cleaning robot is less than 1.6 meters, the central processing module sends a stop instruction;
when the AI target recognition module judges that a pedestrian exists in front of the cleaning robot and the laser radar module judges that an obstacle exists in front of the cleaning robot, and the distances between the pedestrian and the obstacle and the cleaning robot are less than 1.6 meters, the central processing module sends a stop instruction;
when the AI target recognition module judges that no pedestrian exists in front and the laser radar module judges that an obstacle exists in front and the distance between the obstacle and the cleaning robot is less than 1.2 m, the central processing module sends a detouring instruction;
when the AI target recognition module judges that no pedestrian exists in the front and the laser radar module judges that an obstacle exists in the front and the distance between the obstacle and the cleaning robot is greater than 1.2 m, the central processing module sends a forward instruction.
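The rules above use concrete thresholds from this embodiment: 1.6 m for pedestrians (including the combined pedestrian-plus-obstacle case) and 1.2 m for obstacles. A minimal sketch, with illustrative function and argument names:

```python
# Thresholds from the Fig. 2 decision rules of this embodiment.
PERSON_STOP_DIST = 1.6      # meters: pedestrian closer than this -> stop
OBSTACLE_DETOUR_DIST = 1.2  # meters: obstacle closer than this -> detour

def decide_command(person_seen, person_dist, obstacle_seen, obstacle_dist):
    # A close pedestrian stops the robot whether or not an obstacle is present.
    if person_seen and person_dist is not None and person_dist < PERSON_STOP_DIST:
        return "stop"
    # With no pedestrian, a close obstacle triggers a detour.
    if obstacle_seen and obstacle_dist is not None and obstacle_dist < OBSTACLE_DETOUR_DIST:
        return "detour"
    return "forward"
```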
The cleaning robot in this embodiment adopts an industrial personal computer (model: AIC-1U03) as the central processing module and an STM32 microprocessor (brand: STMicroelectronics, model: STM32F407IET6) as the micro-processing module; an area array laser radar (model: AL-10) and an AI target recognition lens (model: OV2640) are arranged on the front end face of the cleaning robot, and a laser radar (model: LMS141) is arranged on the front side of the bottom face.
The data collected by the motion control system includes: angular velocity (w), acceleration (a), odometry data (x, y), the AI target recognition result (1 if a person is present, 0 otherwise), and the area array radar distance (d). The microprocessor sends the collected data to the central processing module over an RS-485 link, where an improved SLAM algorithm processes it in real time. The SLAM algorithm determines the robot's operating state from these data. For example, when the AI target recognition reports a person ahead (value 1) and the area array radar distance d is less than 1.6 meters, the central processing module sends a stop instruction; the cleaning robot stops and waits, and resumes once the pedestrian has left. If what is ahead is an obstacle rather than a person (AI target recognition value 0) and the laser radar detects it at a distance of less than 1.2 meters, the central processing module sends a detour instruction, and the cleaning robot continues forward after going around the obstacle.
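The sensor record described above (w, a, x, y, person flag, d) could be serialized for the RS-485 link roughly as follows. The field layout, byte order, and function names here are assumptions for illustration; the patent does not specify its actual frame format.

```python
import struct

def pack_record(w, a, x, y, person_flag, d):
    """Serialize one sensor record: five little-endian float32 values
    (w, a, x, y, d) followed by a one-byte person flag (0 or 1)."""
    return struct.pack("<5fB", w, a, x, y, d, person_flag)

def unpack_record(frame):
    """Parse a frame produced by pack_record back into a dict."""
    w, a, x, y, d, person_flag = struct.unpack("<5fB", frame)
    return {"w": w, "a": a, "x": x, "y": y, "d": d, "person": person_flag}
```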
In Fig. 4, the AI target detection algorithm uses YOLO, which extracts features with a convolutional network and then obtains predictions through fully-connected layers. The network structure follows the GoogLeNet model and contains 24 convolutional layers and 2 fully-connected layers, as shown. In the convolutional layers, 1x1 convolutions are used mainly for channel reduction, each followed by a 3x3 convolution. The convolutional and fully-connected layers use the Leaky ReLU activation function max(x, 0.1x), except the last layer, which uses a linear activation.
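The Leaky ReLU activation mentioned above, max(x, 0.1x), is trivial to state directly (a minimal scalar sketch, not the network implementation):

```python
def leaky_relu(x: float) -> float:
    # For positive x this returns x; for negative x, 0.1x > x, so the
    # function "leaks" a small gradient instead of clamping to zero.
    return max(x, 0.1 * x)
```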
Multiplying the class probability predicted by each grid cell with the confidence predicted by each bounding box gives the class-specific confidence score of each bounding box:
Pr(Class_i | Object) × Pr(Object) × IOU(truth, pred) = Pr(Class_i) × IOU(truth, pred)
The first term on the left side of the equation is the class probability predicted by each grid cell, and the second and third terms together form the confidence predicted by each bounding box. This product encodes both the probability that the predicted box contains an object of that class and how accurately the box fits the object.
The above description covers only a preferred embodiment of the present invention, and the scope of the present invention is not limited thereto. Any modification or equivalent replacement that a person skilled in the art could readily conceive within the technical scope disclosed herein, based on the technical solutions and inventive concept of the present invention, shall fall within the scope of the present invention.
Claims (10)
1. A cleaning robot motion control system based on multi-sensing fusion, characterized by comprising:
the environment sensing module is used for acquiring environment information of the cleaning robot in a working state;
the bottom layer main control module is used for acquiring working parameter information of the cleaning robot in a working state;
the micro-processing module is used for receiving the information sent by the environment sensing module and the bottom layer main control module and transmitting the received information to the central processing module;
the central processing module is used for processing the received information to obtain decision information and transmitting the decision information to the micro-processing module;
and the servo driving module is used for receiving the decision information sent by the micro-processing module and controlling the motion attitude of the cleaning robot.
2. The multi-sensing fusion based cleaning robot motion control system of claim 1, wherein the environmental information of the cleaning robot in the working state comprises: whether a person or an object exists in front of the cleaning robot or not, and the distance between the cleaning robot and the person or the object in front; the working parameter information of the cleaning robot in the working state comprises acceleration, angular velocity and pose information of the cleaning robot body.
3. The multi-sensing fusion based cleaning robot motion control system according to claim 1 or 2, wherein the environment sensing module comprises:
the AI target recognition module is used for judging whether a person is in front of the cleaning robot;
the area array radar module is used for measuring the distance between the person and the cleaning robot body;
and the laser radar module is used for judging whether an obstacle exists in front of the cleaning robot or not and judging the distance between the obstacle and the cleaning robot body under the condition that the obstacle exists.
4. The cleaning robot motion control system based on multi-sensing fusion of claim 1 or 2, wherein the bottom layer main control module comprises:
the inertia measurement unit is used for acquiring the acceleration and the angular speed of the cleaning robot during movement;
and the odometer unit is used for acquiring pose information of the cleaning robot.
5. The multi-sensing fusion based cleaning robot motion control system according to claim 1 or 2, wherein the central processing module processes the received information using a SLAM algorithm, and the obtained decision information comprises a stop instruction, a forward instruction and a detour instruction; the motion postures of the cleaning robot include forward motion, paused motion and detour motion.
6. A cleaning robot motion control method based on multi-sensing fusion, characterized by comprising the following steps:
S1: the environment sensing module acquires environment information of the cleaning robot in a working state and sends the environment information to the micro-processing module;
S2: the bottom layer main control module acquires working parameter information of the cleaning robot in a working state and sends the working parameter information to the micro-processing module;
S3: the micro-processing module sends the information obtained in S1 and S2 to the central processing module; the central processing module processes the received information to obtain decision information and transmits the decision information to the micro-processing module;
S4: the micro-processing module transmits the decision information to the servo driving module, and the servo driving module drives the cleaning robot to move.
7. The multi-sensing fusion based cleaning robot motion control method according to claim 6, wherein when the environment sensing module detects a person in front of the cleaning robot, the central processing module sends a stop instruction and the cleaning robot pauses its motion; when the environment sensing module detects an obstacle in front of the cleaning robot, the central processing module sends a detour instruction and the cleaning robot moves around the obstacle to avoid it.
8. A computer-readable storage medium on which a computer program is stored, characterized in that the program, when executed by a processor, carries out the steps of the method according to claim 6 or 7.
9. A cleaning robot comprising a robot body and a controller, characterized in that the controller comprises a memory, a processor, and a computer program stored on the memory and executable on the processor, wherein the processor, when executing the program, performs the steps of the method according to claim 6 or 7.
10. The cleaning robot as claimed in claim 9, wherein an area array laser radar and an AI target recognition lens are provided on a front surface of the cleaning robot, and a laser radar is provided on a front side of a bottom surface of the cleaning robot.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110137126.3A CN112947426A (en) | 2021-02-01 | 2021-02-01 | Cleaning robot motion control system and method based on multi-sensing fusion |
Publications (1)
Publication Number | Publication Date |
---|---|
CN112947426A true CN112947426A (en) | 2021-06-11 |
Family
ID=76240789
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202110137126.3A Pending CN112947426A (en) | 2021-02-01 | 2021-02-01 | Cleaning robot motion control system and method based on multi-sensing fusion |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN112947426A (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN115251765A (en) * | 2022-07-04 | 2022-11-01 | 麦岩智能科技(北京)有限公司 | Cleaning robot edge sweeping control method based on multiple sensors |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN105629970A (en) * | 2014-11-03 | 2016-06-01 | 贵州亿丰升华科技机器人有限公司 | Robot positioning obstacle-avoiding method based on supersonic wave |
DE102016125408A1 (en) * | 2016-12-22 | 2018-06-28 | RobArt GmbH | AUTONOMOUS MOBILE ROBOT AND METHOD FOR CONTROLLING AN AUTONOMOUS MOBILE ROBOT |
CN108710376A (en) * | 2018-06-15 | 2018-10-26 | 哈尔滨工业大学 | The mobile chassis of SLAM and avoidance based on Multi-sensor Fusion |
CN109062209A (en) * | 2018-08-07 | 2018-12-21 | 安徽工程大学 | A kind of intelligently auxiliary Ride Control System and its control method |
CN112147615A (en) * | 2020-09-08 | 2020-12-29 | 北京踏歌智行科技有限公司 | Unmanned sensing method based on all-weather environment monitoring system |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN107562048B (en) | Dynamic obstacle avoidance control method based on laser radar | |
US10102429B2 (en) | Systems and methods for capturing images and annotating the captured images with information | |
US10376117B2 (en) | Apparatus and methods for programming and training of robotic household appliances | |
KR102670610B1 (en) | Robot for airport and method thereof | |
AU2011352997B2 (en) | Mobile human interface robot | |
US7899618B2 (en) | Optical laser guidance system and method | |
Kim et al. | End-to-end deep learning for autonomous navigation of mobile robot | |
JP2022540387A (en) | Mobile robot and its control method | |
CN105629970A (en) | Robot positioning obstacle-avoiding method based on supersonic wave | |
CN105892489A (en) | Multi-sensor fusion-based autonomous obstacle avoidance unmanned aerial vehicle system and control method | |
CN111474930B (en) | Tracking control method, device, equipment and medium based on visual positioning | |
US11471016B2 (en) | Method and apparatus for executing cleaning operation | |
US11498587B1 (en) | Autonomous machine motion planning in a dynamic environment | |
WO2022252221A1 (en) | Mobile robot queue system, path planning method and following method | |
CN110844402B (en) | Garbage bin system is summoned to intelligence | |
CN108762247B (en) | Obstacle avoidance control method for self-moving equipment and self-moving equipment | |
CN204423154U (en) | A kind of automatic charging toy robot based on independent navigation | |
WO2020038155A1 (en) | Autonomous movement device, control method and storage medium | |
Chen et al. | A review of autonomous obstacle avoidance technology for multi-rotor UAVs | |
CN112947426A (en) | Cleaning robot motion control system and method based on multi-sensing fusion | |
Thrunyz | The dynamic window approach to collision avoidance | |
CN112540383A (en) | Efficient following method based on laser human body detection | |
AU2015202200A1 (en) | Mobile human interface robot | |
TWI806429B (en) | Modular control system and method for controlling automated guided vehicle | |
CN115790606B (en) | Track prediction method, device, robot and storage medium |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
RJ01 | Rejection of invention patent application after publication | ||
Application publication date: 20210611 |