CN116852360A - Motion control and detection method of man-machine cooperation floor paving robot - Google Patents
- Publication number
- CN116852360A (application number CN202310827296.3A)
- Authority
- CN
- China
- Prior art keywords
- robot
- motion
- model
- detection
- detecting
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1602—Programme controls characterised by the control system, structure, architecture
- B25J9/161—Hardware, e.g. neural networks, fuzzy logic, interfaces, processor
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J11/00—Manipulators not otherwise provided for
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1656—Programme controls characterised by programming, planning systems for manipulators
- B25J9/1661—Programme controls characterised by programming, planning systems for manipulators characterised by task planning, object-oriented languages
-
- E—FIXED CONSTRUCTIONS
- E04—BUILDING
- E04F—FINISHING WORK ON BUILDINGS, e.g. STAIRS, FLOORS
- E04F21/00—Implements for finishing work on buildings
- E04F21/20—Implements for finishing work on buildings for laying flooring
-
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y02—TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
- Y02P—CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
- Y02P90/00—Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
- Y02P90/02—Total factory control, e.g. smart factories, flexible manufacturing systems [FMS] or integrated manufacturing systems [IMS]
Abstract
The invention relates to a motion control and detection method of a man-machine cooperation floor paving robot, comprising motion detection and control together with man-machine interaction floor paving visual detection. The motion detection and control specifically comprises: starting the robot after the expected motor speed is received; reading the motor speed and the robot heading angle and performing double closed-loop PID control. The man-machine interaction floor paving visual detection specifically comprises: collecting video stream data with a camera; running inference on the video stream data with a deep learning model and judging the result, i.e., judging the flatness and straightness of tile paving by image edge detection; if the inference result persists for several seconds, the tile is judged to be laid, and the robot moves to the next point. Compared with the prior art, the invention adopts a man-machine cooperation mode combined with intelligent control, so that construction is more convenient and faster, the cost is lower, working efficiency is improved, and the quality and precision of floor paving are guaranteed.
Description
Technical Field
The invention relates to the technical field of floor paving automation equipment, in particular to a motion control and detection method of a man-machine cooperation floor paving robot.
Background
In recent years, real estate enterprises have carried out various explorations and practices in fields such as intelligent construction sites, prefabricated buildings, and intelligent construction, making building construction increasingly automated and intelligent. Among these, the development of floor paving robots has attracted attention. At present, most of the floor tile paving process is completed on site by a master craftsman and is strongly affected by factors such as environment and skill. With the advent of floor paving robots, standardized paving becomes possible, which may reduce problems such as hollowing, cracking, and flaking caused by human factors.
The traditional floor laying method requires a large amount of manual labor and is time-consuming and labor-intensive; scattered auxiliary tools leave construction sites disordered and make finished-product protection difficult, and workers differ in installation experience and methods, so engineering quality cannot be guaranteed. During paving, the related art cannot reliably achieve standardized straight-line paving by the robot; its capacity is limited and its adaptability to the application environment is poor.
Disclosure of Invention
The invention aims to overcome the above defects of the prior art and to provide a motion control and detection method of a man-machine cooperation floor paving robot.
The aim of the invention can be achieved by the following technical scheme:
As a first aspect of the present invention, there is provided a motion control and detection method of a man-machine cooperation floor paving robot, the method comprising motion detection and control and man-machine interaction floor paving visual detection, wherein
the motion detection and control specifically comprises: starting the robot after the expected motor speed is received; reading the motor speed and the robot heading angle, performing double closed-loop PID control, and outputting a control signal to an actuator for motion control;
the man-machine interaction floor paving visual detection specifically comprises: while the robot works, collecting video stream data with a camera; running inference on the video stream data with the deep learning model and judging the result; if the inference result persists for several seconds, the tile is judged to be laid, and the robot moves to the next point.
Further, the heading angle is measured by a gyroscope; the motor speed is read by a photoelectric encoder.
Furthermore, the photoelectric encoder converts the wheel rotation into two square-wave channels of measurable frequency with a phase difference between them, and the speed is measured by counting the square-wave pulses.
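The two-channel quadrature scheme described above can be sketched as follows. This is an illustrative example, not the patent's actual firmware; all names are hypothetical, and sampling fast enough to catch every transition is assumed.

```python
import math

# Quadrature decoding: channel B leads or lags channel A by 90 degrees, so the
# (A, B) transition sequence yields both the count and the direction of rotation.
# States are encoded as (A << 1) | B; the table maps transitions to +/-1 steps.
_QUAD_STEPS = {
    (0b00, 0b01): +1, (0b01, 0b11): +1, (0b11, 0b10): +1, (0b10, 0b00): +1,
    (0b00, 0b10): -1, (0b10, 0b11): -1, (0b11, 0b01): -1, (0b01, 0b00): -1,
}

def count_quadrature(samples):
    """Count signed encoder steps from a sequence of (A, B) logic samples."""
    count = 0
    prev = (samples[0][0] << 1) | samples[0][1]
    for a, b in samples[1:]:
        state = (a << 1) | b
        if state != prev:
            count += _QUAD_STEPS.get((prev, state), 0)
            prev = state
    return count

def wheel_speed(counts, counts_per_rev, wheel_radius_m, dt_s):
    """Linear wheel speed in m/s from the step count over one sampling window."""
    revs = counts / counts_per_rev
    return revs * 2.0 * math.pi * wheel_radius_m / dt_s
```

With four counted edges per electrical cycle, the sign of the count gives the direction of rotation, and dividing by the window length gives the speed that the inner PID loop tracks.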
Further, the robot performs double closed-loop PID control on the motor speed and the robot heading angle, the specific steps comprising:
setting an expected output angle, reading the gyroscope observation to obtain the angle error e, and having the controller perform PID control according to the control scheme to output a motor speed command;
the controller receives the expected motor speed, reads the actual motor speed through the photoelectric encoder, and performs PID control so that the left and right wheel motors change speed, thereby realizing a speed differential.
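The outer loop of the double closed-loop structure above can be sketched as follows: a heading PID turns the gyroscope angle error into left/right wheel-speed targets, which the inner speed loops (fed by the encoders) would then track with the same PID pattern. This is an illustrative Python sketch under assumed gains and a simplified plant, not the patent's implementation.

```python
class PID:
    """Textbook PID controller; gains here are illustrative, not tuned values."""
    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, error, dt):
        self.integral += error * dt
        derivative = (error - self.prev_error) / dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

def straight_line_step(heading_pid, base_speed, desired_heading, measured_heading, dt):
    """One outer-loop step: returns (left_target, right_target) wheel speeds.
    A positive differential steers the heading in the positive direction."""
    error = desired_heading - measured_heading
    diff = heading_pid.update(error, dt)          # speed differential command
    return base_speed - diff, base_speed + diff   # steer back toward the line
```

A heading disturbance makes one wheel target rise and the other fall around the base speed, so the robot yaws back toward the straight-line course instead of stopping.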
Further, the deep learning model involves the following four steps:
making a VOC data set for the target detection scene, completing data annotation with the labelImg tool, and processing the annotated data into VOC format;
selecting a target detection model and a key point detection model, setting the corresponding model parameters, performing model training, and evaluating the accuracy, loss function, and speed of the model;
performing data quantization of the model, including framework quantization and post-training quantization, and testing the accuracy loss before and after model quantization;
using ncnn to complete model deployment on the board, and evaluating the maximum, minimum, and average inference time after deployment to obtain the real-time frame rate of model operation.
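The deployment-evaluation step above (timing maximum/minimum/average inference and deriving a frame rate) can be sketched language-agnostically; the actual ncnn deployment runs in C++ on the board, so this Python sketch substitutes a stand-in callable for the deployed model's forward pass, and all names are hypothetical.

```python
import time

def benchmark(infer, warmup=3, runs=20):
    """Measure max/min/average latency of an inference callable and the
    implied real-time frame rate. `infer` stands in for the model forward pass."""
    for _ in range(warmup):               # discard cold-start runs
        infer()
    times = []
    for _ in range(runs):
        t0 = time.perf_counter()
        infer()
        times.append(time.perf_counter() - t0)
    avg = sum(times) / len(times)
    return {"max_s": max(times), "min_s": min(times),
            "avg_s": avg, "fps": 1.0 / avg}

# Stand-in workload in place of the quantized detection model.
stats = benchmark(lambda: sum(i * i for i in range(10000)))
```

The `fps` figure from the average latency is what determines whether the camera frame rate chosen later can actually be processed in real time.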
Furthermore, the floor paving visual detection also collects image data through the camera and analyzes the image with the LSD (Line Segment Detector) algorithm to detect the straightness and flatness of tile paving in the image.
Further, the step of detecting the straightness and flatness of the floor paving comprises the following steps:
collecting image data using a camera;
obtaining sets of straight-line pixels through local analysis of the image by the LSD algorithm, then verifying and solving through hypothesized parameters, merging the pixel sets with error-control sets, and thereby adaptively controlling the number of false detections;
the LSD algorithm uses gradient information and the level-line field for straight-line detection and edge detection, and evaluates the straightness and flatness of the floor paving.
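LSD itself returns line segments (OpenCV exposes an implementation via `cv2.createLineSegmentDetector`, for example). Given those segments, a straightness check along a tile joint might be scored as below; this is an illustrative criterion with hypothetical names and tolerances, not the patent's exact evaluation.

```python
import math

def segment_angle_deg(x1, y1, x2, y2):
    """Orientation of a detected line segment in degrees, folded into [0, 90]."""
    ang = abs(math.degrees(math.atan2(y2 - y1, x2 - x1))) % 180.0
    return min(ang, 180.0 - ang)

def joint_straightness(segments, expect_deg=0.0, tol_deg=2.0):
    """Given LSD segments (x1, y1, x2, y2) along a tile joint, return the worst
    angular deviation from the expected joint direction and a pass/fail flag."""
    deviations = [abs(segment_angle_deg(*s) - expect_deg) for s in segments]
    worst = max(deviations)
    return worst, worst <= tol_deg
```

Only in-image angular deviation is scored here; flatness of the laid surface would additionally need depth information or joints checked in more than one direction.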
Further, the robot is also provided with a liquid level sensing device, and the method performs the following control based on its detection data:
when the liquid level sensing device detects that the amount of cement in the storage box is too low, the robot stops working and simultaneously alarms to remind the user to refill;
when the liquid level falls to the lowest level, i.e., the out-of-material state, the liquid level sensing device immediately issues a signal, and the robot alarms as soon as it receives the signal;
when the user refills the storage box with cement, the liquid level sensing device detects that the storage box is in the material-present state and issues a signal, and the robot resumes work.
Further, the output signal of the liquid level sensing device is a high or low logic level: a low level is output in the material-present state, and a high level in the material-absent state.
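The stop/alarm/resume behaviour described above reduces to a small state machine. A hedged sketch follows; the class and names are hypothetical, and the polarity (high level = no material) follows the text.

```python
LOW, HIGH = 0, 1  # sensor logic levels: low = material present, high = empty

class LevelMonitor:
    """Illustrative material-level state machine for the paving robot."""
    def __init__(self):
        self.running = True
        self.alarm = False

    def on_sensor(self, level):
        if level == HIGH:          # no material: stop and alarm for refill
            self.running = False
            self.alarm = True      # e.g. buzzer / LED in the real device
        else:                      # material present: silence alarm, resume
            self.alarm = False
            self.running = True
```

In the real device the alarm output would drive the buzzer or LED circuit, and `running` would gate the motion controller.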
Compared with the prior art, the invention has the following beneficial effects:
the invention designs a human-computer interaction floor paving visual detection system, judges whether paving is finished or not by detecting specific gestures through a camera, realizes human-computer interaction, detects straightness and flatness of floor paving through an LSD algorithm, feeds back, can greatly improve working efficiency and ensures construction quality.
According to the technical scheme provided by the invention, a gyroscope and a photoelectric encoder are applied to perform a double closed-loop PID control algorithm of the motor rotating speed and the angle of the robot in a straight running way, so that the straight running control of the paving robot is better realized.
The invention also provides a liquid level sensing device which monitors and alarms the state in the storage box and optimizes the fluency and reliability of the operation process.
The invention not only ensures the quality and precision of ceramic tile paving, but also greatly improves the flexibility and stability of the whole system of floor paving. In addition, the working mode of man-machine cooperation can be used as an auxiliary tool for intelligent construction, so that the construction progress, engineering quantity and the like can be monitored in real time, and systematic management can be performed.
Drawings
FIG. 1 is a schematic view of the mechanical structure of the present invention;
FIG. 2 is a flow chart of a software system of the present invention;
The reference labels in the figures are as follows: 1. gyroscope; 2. photoelectric encoder; 3. wheel; 4. RGB camera; 5. power supply; 6. automotive-grade controller; 7. camera fixing clamp; 8. ceramic tile; 9. photoelectric liquid level sensor; 10. buzzer; 11. storage box; 12. motor; 13. machine bracket; 14. telescopic spring.
Detailed Description
The invention will now be described in detail with reference to the drawings and specific examples. The present embodiment is implemented on the premise of the technical scheme of the present invention, and a detailed implementation manner and a specific operation process are given, but the protection scope of the present invention is not limited to the following examples.
Example 1
As shown in fig. 1, as one embodiment of the present invention, there is provided a motion control and detection method of a man-machine cooperation floor paving robot, the method comprising gyroscope-and-control-algorithm-based straight-line motion detection and control, and man-machine interaction floor paving visual detection; the main steps are as follows:
s1, the robot controller receives the expected motor speed and starts;
s2, the gyroscope immediately measures the heading angle;
s3, the photoelectric encoder reads the actual motor speed at that moment;
s4, the controller performs PID control and outputs a control signal to the actuator;
s5, the motor speed changes so that the robot heading oscillates only within a small, controllable angle around the straight-line course;
s6, the robot puts down the tile, a worker lays it, and after the floor is laid the worker gives a special gesture;
s7, the RGB camera captures video stream data, the deep learning model outputs the corresponding inference result, and when the inference result persists for several seconds, the tile is judged to be laid; a signal is sent to the driving detection system, and the construction vehicle moves to the next point;
s8, the RGB camera captures image data, the LSD algorithm performs straight-line detection on the image using gradient information and the level-line field, detects the straightness and flatness of the floor paving, and feeds the detection data back to the controller;
s9, when the photoelectric liquid level sensor detects that the amount of cement in the storage box is too low, the equipment stops working, and the buzzer sounds or the LED indicator lights to remind the user to refill;
s10, after the user refills the storage box with cement, the photoelectric liquid level sensor detects that the storage box is in the material-present state and issues a signal, and the equipment resumes working.
For the gyroscope-and-control-algorithm-based straight-line motion detection and control, a six-axis MPU6050 sensor is selected as the motion processing component, and the gyroscope 1 is used to detect and control straight-line driving during robot motion.
The actual speed of the motor 12 is read through the photoelectric encoder 2, which converts the rotation of the wheels 3 into two square-wave channels of measurable frequency with a phase difference; the microcontroller measures the speed by counting the square-wave pulses, and straight-line driving is realized by differential control of the two wheels 3.
After the robot controller receives the expected motor speed and starts, the gyroscope 1 immediately measures the heading angle, the encoder 2 reads the actual motor speed at that moment, the PID algorithm detects the deviation of the driving direction, and the speed of the wheels 3 is quickly adjusted so that the vehicle heading oscillates only within a small, controllable angle around the straight-line course.
Further, in step S4 the controller performs double closed-loop PID control on the motor speed and the robot heading angle according to the control scheme to realize stable straight-line driving; the straight-line control principle and strategy are as follows:
1. Set the expected output angle, i.e., 0 degrees when driving straight; read the gyroscope observation to obtain the angle error e; the controller performs PID control and outputs a motor speed command according to the control scheme.
2. The controller receives the expected motor speed, reads the actual motor speed through the photoelectric encoder, and performs PID control so that the left and right wheel motors change speed, realizing a speed differential.
3. The difference in motor speeds causes the robot to turn.
4. By performing double closed-loop PID control on the motor speed and the robot heading angle according to the control scheme, stable straight-line driving is achieved.
The man-machine interaction floor paving visual detection system realizes control and detection; in hardware it mainly uses the RGB camera 4, the power supply 5, the automotive-grade controller 6, and the camera fixing clamp 7. The RGB camera 4 is selected according to the size of the tile 8, the key parameters being field of view, resolution, and frame rate; the usable frame rate is determined by the model's inference throughput, and the higher the frame rate, the more responsive the system. On the basis of the RGB imaging principle, the camera imaging model and the vision coordinate system are studied, and the parameters of the RGB camera 4 are obtained through a camera calibration experiment. The automotive-grade controller 6 is responsible for model inference and system implementation; the system is based on a free, open-source Linux system, the CPU should be no weaker than a dual-core Cortex-A72, and memory and other resources must meet the basic inference requirements of the algorithm.
The software flow of the man-machine interaction floor paving visual detection system in step S7 mainly consists of the RGB camera 4 collecting video stream data and the controller 6 running inference with the deep learning model and judging the result. If the inference result persists for several seconds, the tile is judged to be laid, a signal is sent to the driving detection system, and the construction vehicle moves to the next point.
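The "result persists for several seconds" decision above amounts to a debounce over consecutive inference results: the tile is judged laid only after the completion gesture has been reported continuously for the hold time. An illustrative sketch follows; the class name and the 3 s hold time are assumptions, and timestamps are passed in explicitly so the logic is testable without a real clock.

```python
class PersistenceGate:
    """Judge a detection 'final' only after it persists for hold_s seconds."""
    def __init__(self, hold_s=3.0):
        self.hold_s = hold_s
        self.since = None          # time the positive result first appeared

    def update(self, detected, now_s):
        """Feed one inference result; returns True once it has persisted."""
        if not detected:
            self.since = None      # any negative frame resets the window
            return False
        if self.since is None:
            self.since = now_s
        return now_s - self.since >= self.hold_s
```

In the running system, `update` would be called once per inference frame with the model's gesture result, and a `True` return would trigger the move-to-next-point signal.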
Further, the deep learning model in step S7 mainly involves four steps:
1. Data set preparation: make a VOC data set for the scene required by the target detection system, complete the data annotation mainly with the labelImg tool, and process the annotated data into VOC format.
2. Model training: select a target detection model and a key point detection model, set the corresponding model parameters, perform model training on a server, and evaluate the accuracy, loss function, and speed of the model.
3. Model quantization: to increase the inference speed of the model, perform data quantization, model framework quantization, and post-training quantization, and test the accuracy loss of the model before and after quantization.
4. Model deployment: use ncnn to complete deployment of the model on the board, and evaluate the maximum, minimum, and average inference time after deployment to obtain the real-time frame rate of model operation.
The visual detection of floor paving straightness and flatness mainly collects image data through the RGB camera 4 and analyzes the image with the LSD algorithm, detecting the straightness and flatness of the paving of tiles 8 in the image.
Further, the specific process of step S8 mainly comprises: the RGB camera collects image data; the LSD algorithm obtains sets of straight-line pixels by local analysis of the image, then verifies and solves through hypothesized parameters, merging the pixel sets with error-control sets so as to adaptively control the number of false detections. The LSD algorithm uses gradient information and the level-line field for straight-line detection and edge detection, can effectively evaluate the straightness and flatness of the floor paving, and transmits the detection data to the controller 6.
In the liquid level sensing device, when the photoelectric liquid level sensor 9 detects that the amount of cement in the storage box is too low, the equipment stops working, and the buzzer 10 sounds or the LED indicator lights to remind the user to refill.
Liquid level detection exploits the difference in received light between the liquid and non-liquid states. When the liquid level falls to the lowest level, i.e., the out-of-material state, the photoelectric liquid level sensor immediately issues a signal, and on receiving it the equipment drives the control circuit to sound the buzzer or light the LED. When the user refills the storage box with cement, the photoelectric liquid level sensor detects that the storage box 11 is in the material-present state and issues a signal, and the equipment resumes operation.
The output of the photoelectric liquid level sensor depends only on whether the photoelectric probe contacts the liquid, not on other properties of the medium (temperature, pressure, density, or electrical parameters), so liquid level detection is accurate with high repeatability; the response is fast and level control is very precise.
Further, the output signal of the photoelectric liquid level sensor in step S10 is a high or low logic level: a low level is output in the material-present state and a high level in the material-absent state, effectively realizing the low-material reminder.
In summary, by adopting the technical scheme of the invention, the gyroscope, the PID control algorithm, the man-machine interaction floor paving visual detection system, and the liquid level sensing device together provide greater operational flexibility and reliability, realizing intelligent straight-line motion detection and control of the robot and greatly improving construction quality and working efficiency. The invention not only guarantees the quality and precision of tile paving but also greatly improves the flexibility and stability of the floor paving system as a whole. In addition, the man-machine cooperation working mode can serve as an auxiliary tool for intelligent construction, so that construction progress, work quantity, and the like can be monitored in real time and managed systematically.
Example 2
As shown in fig. 2, as another embodiment of the present invention, there is provided a tile laying robot adopting the control and detection method of the above embodiment. The robot comprises a gyroscope-and-control-algorithm-based straight-line motion detection and control system and a man-machine interaction floor paving visual detection system. A pair of crawler wheels (3) is provided on each side of the robot; a machine bracket (13) is fixed to the tile laying robot, and the end of the machine bracket (13) is connected to a tile laying claw through a telescopic spring (14) for laying tiles (8). The robot is also provided with a power supply (5), a controller (6), a storage box (11), and a motor (12). The storage box (11) is fitted with a liquid level sensing device.
The gyroscope-and-control-algorithm-based straight-line motion detection and control system on the paving robot mainly comprises, in hardware: a gyroscope (1) for detecting the heading angle, a photoelectric encoder (2) for detecting the motor speed, a controller (6), and a motor (12). After the controller (6) receives the expected speed of the motor (12) and starts, the gyroscope (1) immediately measures the heading angle, the actual motor speed is read through the photoelectric encoder (2), the PID algorithm detects the deviation of the driving direction, and the wheel speed is quickly adjusted so that the vehicle heading oscillates only within a small, controllable angle around the straight-line course. Double closed-loop PID control over the speed of the motor (12) and the vehicle heading angle realizes stable straight-line driving.
The man-machine interaction floor paving visual detection system mainly comprises, in hardware: an RGB camera (4) fixed by a fixing clamp (7) on the side of the robot facing the working area, a power supply (5), and a controller (6).
The visual detection of the man-machine interaction system collects video stream data through the RGB camera (4), and the controller (6) runs inference on the video stream data with the deep learning model and judges the result. If the inference result persists for several seconds, the tile (8) is judged to be laid, a signal is sent to the driving detection system, and the construction vehicle moves to the next point.
The visual detection of the man-machine interaction system can also detect the straightness and flatness of floor paving: image data are collected through the RGB camera (4), the image is analyzed with the LSD algorithm, and the straightness and flatness of the paving of tiles (8) in the image are detected.
The liquid level sensing device adopts a photoelectric liquid level sensor (9), which detects the liquid level using the difference in received light between the liquid and non-liquid states; the photoelectric liquid level sensor (9) detects the level at the bottom of the storage box (11). When the level falls to the lowest point, the sensor immediately issues a signal, and on receiving it the equipment drives the control circuit to sound the buzzer (10) or light the warning lamp. When the user refills the storage box with cement, the photoelectric liquid level sensor detects that the storage box (11) is in the material-present state and issues a signal, and the equipment resumes working.
The foregoing describes preferred embodiments of the present invention in detail. It should be understood that a person of ordinary skill in the art can make numerous modifications and variations according to the concept of the invention without creative effort. Therefore, all technical solutions that a person skilled in the art can obtain through logical analysis, reasoning, or limited experiments based on the prior art and the inventive concept shall fall within the scope of protection defined by the claims.
Claims (9)
1. A motion control and detection method of a man-machine cooperation floor paving robot, characterized in that the method comprises motion detection and control and man-machine interaction floor paving visual detection, wherein
the motion detection and control specifically comprises: starting the robot after the expected motor speed is received; reading the motor speed and the robot heading angle, performing double closed-loop PID control, and outputting a control signal to an actuator for motion control;
the man-machine interaction floor paving visual detection specifically comprises: while the robot works, collecting video stream data with a camera; running inference on the video stream data with the deep learning model and judging the result; if the inference result persists for several seconds, the tile is judged to be laid, and the robot moves to the next point.
2. The method for controlling and detecting the motion of a human-computer collaborative floor laying robot according to claim 1, wherein the heading angle is measured by a gyroscope; the motor rotation speed is read by a photoelectric encoder.
3. The motion control and detection method of a man-machine cooperation floor paving robot according to claim 2, wherein the photoelectric encoder converts the wheel rotation into two square-wave channels of measurable frequency with a phase difference between them, and the speed is measured by counting the square-wave pulses.
4. The motion control and detection method for a human-robot collaborative floor-laying robot according to claim 2, wherein the robot performs double closed-loop PID control on the motor speed and the robot heading angle, the specific steps comprising:
setting an expected output angle, reading the gyroscope observation to obtain the angle error e, and having the controller perform PID control according to the control scheme to output the expected motor speed;
receiving the expected motor speed at the controller, reading the actual motor speed through the photoelectric encoder, and performing PID control so that the left and right wheel motors change speed to realize differential steering.
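The cascade in claim 4 (angle outer loop feeding speed inner loops) can be sketched as follows. All gains, the `base_speed` parameter, and the way the heading correction is split between the wheels are illustrative assumptions; the patent specifies only the double closed-loop structure:

```python
class PID:
    """Textbook discrete PID controller."""
    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_err = None

    def step(self, err, dt):
        self.integral += err * dt
        deriv = 0.0 if self.prev_err is None else (err - self.prev_err) / dt
        self.prev_err = err
        return self.kp * err + self.ki * self.integral + self.kd * deriv

# Outer loop: heading angle -> wheel-speed correction.
# Inner loops: one speed PID per wheel. Gains are illustrative.
angle_pid = PID(kp=2.0, ki=0.0, kd=0.1)
left_pid  = PID(kp=1.0, ki=0.5, kd=0.0)
right_pid = PID(kp=1.0, ki=0.5, kd=0.0)

def control_step(target_angle, gyro_angle, base_speed,
                 left_speed, right_speed, dt=0.01):
    """One tick of double closed-loop control; returns the two motor
    commands (arbitrary actuator units)."""
    turn = angle_pid.step(target_angle - gyro_angle, dt)   # outer loop
    # Differential drive: split the heading correction between wheels.
    left_cmd  = left_pid.step((base_speed - turn) - left_speed, dt)
    right_cmd = right_pid.step((base_speed + turn) - right_speed, dt)
    return left_cmd, right_cmd
```

A positive heading error raises the right-wheel setpoint and lowers the left one, which is exactly the differential-speed behavior the claim describes.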
5. The motion control and detection method for a human-robot collaborative floor-laying robot according to claim 1, wherein the deep learning model is built in four steps:
creating a VOC-format dataset for the target detection scene, completing the data annotation with the labelImg tool, and processing the annotated data into the VOC format;
selecting a target detection model and a keypoint detection model, setting the corresponding model parameters, training the models, and evaluating their accuracy, loss function, and speed;
quantizing the model data, performing framework quantization and post-training quantization of the model, and testing the accuracy loss before and after quantization;
deploying the model to the board with ncnn, and evaluating the maximum, minimum, and average inference times after deployment to obtain the real-time frame rate of the model.
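The quantization step above trades precision for speed, and the "accuracy loss before and after quantization" can be illustrated on a single tensor. The sketch below shows symmetric int8 post-training quantization with a max-abs scale, which is the simplest common scheme and an assumption here (ncnn's own int8 tooling uses calibration-based scales):

```python
def quantize_int8(weights):
    """Quantize a list of float weights to int8 with a symmetric
    max-abs scale; returns (quantized values, scale)."""
    scale = max(abs(w) for w in weights) / 127.0
    q = [max(-128, min(127, round(w / scale))) for w in weights]
    return q, scale

def dequantize(q, scale):
    """Map int8 values back to floats for comparison."""
    return [v * scale for v in q]

def max_abs_error(weights, restored):
    """Worst-case per-weight reconstruction error: a simple proxy for
    the accuracy loss introduced by quantization."""
    return max(abs(a - b) for a, b in zip(weights, restored))
```

For any max-abs scheme the reconstruction error is bounded by half the scale, which is why the claim checks accuracy both before and after quantization.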
6. The motion control and detection method for a human-robot collaborative floor-laying robot according to claim 1, wherein the visual detection of floor laying also acquires image data through the camera, analyzes the images with the LSD algorithm, and detects the straightness and flatness of the tile laying in the images.
7. The motion control and detection method for a human-robot collaborative floor-laying robot according to claim 6, wherein the step of detecting the straightness and flatness of the floor laying comprises:
collecting image data with a camera;
obtaining sets of collinear pixel points through local analysis of the image by the LSD algorithm, then validating and solving through hypothesized parameters, merging the pixel point sets with error control, and thereby adaptively controlling the number of false detections;
performing straight-line detection and edge detection with the LSD algorithm using gradient information and row-column lines, and evaluating the straightness and flatness of the floor laying.
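OpenCV ships an LSD implementation (`cv2.createLineSegmentDetector`), which returns line segments as endpoint pairs. The patent does not give the straightness formula, so the sketch below shows one plausible downstream metric on already-detected segments: the maximum angular deviation from the length-weighted dominant direction. Both the metric and the segment format are illustrative assumptions:

```python
import math

def straightness_deviation(segments):
    """Given segments as ((x1, y1), (x2, y2)) tuples, e.g. from an LSD
    detector, return the maximum angular deviation (degrees) of any
    segment from the dominant direction; 0 means perfectly straight
    tile joints."""
    angles, weights = [], []
    for (x1, y1), (x2, y2) in segments:
        dx, dy = x2 - x1, y2 - y1
        angles.append(math.atan2(dy, dx) % math.pi)  # undirected angle
        weights.append(math.hypot(dx, dy))            # weight by length
    # Length-weighted circular mean, doubled angles for period pi.
    s = sum(w * math.sin(2 * a) for a, w in zip(angles, weights))
    c = sum(w * math.cos(2 * a) for a, w in zip(angles, weights))
    mean = (math.atan2(s, c) / 2) % math.pi
    def dev(a):
        d = abs(a - mean) % math.pi
        return min(d, math.pi - d)
    return math.degrees(max(dev(a) for a in angles))
```

Angles are folded into [0, pi) because a joint line has no direction; a laying-quality threshold could then be applied to the returned deviation.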
8. The motion control and detection method for a human-robot collaborative floor-laying robot according to claim 1, wherein the robot is further provided with a liquid level sensing device, and the method is controlled according to the detection data of the liquid level sensing device as follows:
when the liquid level sensing device detects that the amount of cement in the storage box is too low, the robot stops working and raises an alarm to remind the user to refill;
when the level is at the lowest position and in the out-of-material state, the liquid level sensing device immediately issues a signal, and the robot alarms immediately upon receiving it;
when the user refills the storage box with cement, the liquid level sensing device detects that the storage box is in the material-present state, issues a signal, and the robot resumes work.
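The interlock in claims 8 and 9 reduces to a two-state machine driven by the sensor's high/low output (low = material present, high = material absent, per claim 9). A minimal sketch, where the class interface and attribute names are assumptions:

```python
HIGH, LOW = 1, 0  # claim 9: LOW = material present, HIGH = material absent

class LevelMonitor:
    """Cement-level interlock: stop and alarm when the storage box runs
    empty, resume work when it is refilled."""

    def __init__(self):
        self.running = True
        self.alarm = False

    def on_signal(self, level):
        if level == HIGH:        # no material: stop and raise the alarm
            self.running = False
            self.alarm = True
        else:                    # material present: clear alarm, resume
            self.running = True
            self.alarm = False
        return self.running
```

The actuator and alarm hardware would be driven from `running` and `alarm`; here they are plain flags for illustration.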
9. The motion control and detection method for a human-robot collaborative floor-laying robot according to claim 8, wherein the output signal of the liquid level sensing device is a high or low level: a low level is output in the material-present state and a high level in the material-absent state.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202310387859.1A CN116300963A (en) | 2023-04-12 | 2023-04-12 | Motion control and detection method of man-machine cooperation floor paving robot and robot |
CN2023103878591 | 2023-04-12 |
Publications (1)
Publication Number | Publication Date |
---|---|
CN116852360A true CN116852360A (en) | 2023-10-10 |
Family
ID=86797874
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202310387859.1A Pending CN116300963A (en) | 2023-04-12 | 2023-04-12 | Motion control and detection method of man-machine cooperation floor paving robot and robot |
CN202310827296.3A Pending CN116852360A (en) | 2023-04-12 | 2023-07-07 | Motion control and detection method of man-machine cooperation floor paving robot |
Family Applications Before (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202310387859.1A Pending CN116300963A (en) | 2023-04-12 | 2023-04-12 | Motion control and detection method of man-machine cooperation floor paving robot and robot |
Country Status (1)
Country | Link |
---|---|
CN (2) | CN116300963A (en) |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN117808324B (en) * | 2024-02-27 | 2024-06-04 | 西安麦莎科技有限公司 | Building progress assessment method for unmanned aerial vehicle vision coordination |
2023
- 2023-04-12 CN CN202310387859.1A patent/CN116300963A/en active Pending
- 2023-07-07 CN CN202310827296.3A patent/CN116852360A/en active Pending
Also Published As
Publication number | Publication date |
---|---|
CN116300963A (en) | 2023-06-23 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN105045950B (en) | A kind of bridge security assessment system based on 3 D laser scanning | |
Rome et al. | Towards autonomous sewer robots: the MAKRO project | |
CN102506737B (en) | Pipeline detection device | |
CN103433810B (en) | Complicated curve surface normal vector on-machine detection device and method | |
CN104932507B (en) | A kind of night patrol machine people automatic tracking method | |
CN106142104A (en) | Self-movement robot and control method thereof | |
US20230236608A1 (en) | Method and system for inspecting a building construction site using a mobile robotic system | |
CN116852360A (en) | Motion control and detection method of man-machine cooperation floor paving robot | |
JP2009123061A (en) | System for detecting robot position | |
CN111754638B (en) | Automatic dust suppression and dust fall system in storage yard and dust suppression and dust fall method in storage yard | |
CN108151766B (en) | Positioning method of magnetic nails, positioning navigation error correction method of magnetic nails and positioning device | |
CN113189977A (en) | Intelligent navigation path planning system and method for robot | |
CN102331296A (en) | Method, device and system for detecting vibration of arm frame of engineering machine, and engineering machine | |
CN216645248U (en) | Reinforcing bar interval detection device | |
CN105516688A (en) | Resolution-transforming type eagle eye-mimic visual imaging device and imaging method thereof | |
CN109933069A (en) | The conducting wire flaw detection robot tele-control system and control method of view-based access control model and force feedback | |
Yan et al. | Multi-line laser structured light fast visual positioning system with assist of TOF and CAD | |
CN112947461B (en) | VSLAM algorithm-based blast furnace tuyere platform inspection robot control method | |
CN209469434U (en) | Development machine | |
CN105333863A (en) | 3D footprint quantitative test and analysis system for police | |
JP5087360B2 (en) | Inspection system | |
CN103363916A (en) | Information processing method and processing device | |
CN109209418A (en) | Development machine and its control method | |
CN116045908A (en) | Method and system for measuring inclination angle of transmission tower body | |
Wang et al. | Robot floor‐tiling control method based on finite‐state machine and visual measurement in limited FOV |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||