CN112147995A - Robot motion control method and device, robot and storage medium

Info

Publication number: CN112147995A (application CN201910579972.3A; granted as CN112147995B)
Authority: CN (China)
Prior art keywords: robot, motion, module, micro, error
Inventor: 刘洋
Current and original assignee: Shenzhen Maker Works Technology Co., Ltd.
Application filed by Shenzhen Maker Works Technology Co., Ltd.
Other languages: Chinese (zh)
Other versions: CN112147995B (en)
Legal status: Granted; Active

Classifications

    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02 Control of position or course in two dimensions
    • G05D1/021 Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0212 Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory
    • G05D1/0221 Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory involving a learning process
    • G05D1/0223 Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory involving speed control of the vehicle

Abstract

A robot motion control method and apparatus, a robot, and a storage medium. The invention discloses a robot motion control method and apparatus in which, while the robot is in a static state, the measurement error of the micromechanical inertial element is corrected according to the static attitude information output by the micromechanical inertial module and the map area code read by the photosensitive module. After the robot switches from the static state to the motion state in response to a triggered motion control signal, the micromechanical inertial module can therefore perform attitude calculation on accurate measurement results output by the micromechanical inertial element and obtain the robot's current motion attitude information accurately. During motion, the robot corrects its motion attitude in real time according to the map area code read by the photosensitive module or the motion attitude information output by the micromechanical inertial module, which ensures that the robot moves across the spliced tiles along the motion path indicated by the motion control signal.

Description

Robot motion control method and device, robot and storage medium
Technical Field
The invention relates to the field of communication technologies, and in particular to a robot motion control method and apparatus, a robot, and a computer-readable storage medium.
Background
When a robot moves over a long distance, it may deviate from its predetermined motion path because of poor road conditions, drift in the robot itself, and similar factors.
In an existing implementation, a camera mounted on the robot detects the robot's current position at preset intervals to obtain its current position feature information, which is sent to a processing module; the processing module then replans a motion path for the robot, and the robot moves along the replanned path, avoiding deviation from the predetermined one.
However, such a solution places strict requirements on factors such as the number and mounting positions of the cameras and on the external environment. For example, the mounting position and the ambient light both affect the accuracy of imaging at the current position, and hence the accuracy of the acquired position feature information; and while more cameras yield more accurate position feature information, they also increase the computational load and more easily degrade the robot's performance. It is therefore difficult to control the robot's motion path accurately in this way.
How to control the motion path of a robot accurately is thus a problem to be solved urgently.
Disclosure of Invention
To solve the technical problem in the prior art that the motion path of a robot cannot be controlled accurately, the present invention provides a robot motion control method and apparatus, a robot, and a computer-readable storage medium, which correct the robot's motion attitude in real time so that the robot moves along a predetermined motion path.
The technical scheme adopted by the invention is as follows:
a method of motion control of a robot, comprising:
a motion control apparatus of a robot, comprising:
A robot, comprising a processor and a memory storing instructions executable by the processor, wherein the processor is configured to perform the above-described robot motion control method by executing the instructions.
A computer-readable storage medium, on which a computer program is stored which, when executed by a processor, implements the above-described robot motion control method.
In the above technical solution, the measurement error of the micromechanical inertial element is corrected according to the static attitude information output by the micromechanical inertial module and the map area code read by the photosensitive module while the robot is in the static state. After the robot switches from the static state to the motion state in response to a triggered motion control signal, the micromechanical inertial module can therefore perform attitude calculation on accurate measurement results output by the micromechanical inertial element and obtain the robot's current motion attitude information accurately. During motion, the robot corrects its motion attitude in real time according to the map area code read by the photosensitive module or the motion attitude information output by the micromechanical inertial module, which ensures that it moves across the spliced tiles along the motion path indicated by the motion control signal.
Compared with the prior art, the robot's current position information is obtained accurately from the map area code read by the photosensitive module or from the motion attitude information output by the micromechanical inertial module. It is not affected by factors such as cameras or the external environment, so the robot's motion attitude can be corrected accurately and the accuracy of its motion path is guaranteed.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the invention, as claimed.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the invention and together with the description, serve to explain the principles of the invention.
FIG. 1 is a schematic illustration of an implementation environment in which the present invention is concerned;
FIG. 2 is a schematic illustration of a road map involved in the implementation environment shown in FIG. 1;
FIG. 3 is a diagram illustrating angle codes contained in tiles involved in the implementation environment shown in FIG. 1;
FIG. 4 is a schematic diagram of a grid map involved in the implementation environment shown in FIG. 1;
FIG. 5 is a block diagram illustrating a hardware configuration of a robot in accordance with an exemplary embodiment;
FIG. 6 is a flow chart illustrating a method for motion control of a robot in accordance with an exemplary embodiment;
FIG. 7 is a flow chart of one embodiment of step 210 in the corresponding embodiment of FIG. 6;
FIG. 8 is a flow chart of one embodiment of step 250 in the corresponding embodiment of FIG. 6;
FIG. 9 is a flow chart of one embodiment of step 253 in the corresponding embodiment of FIG. 8;
FIG. 10 is a flow chart of another embodiment of step 250 in the corresponding embodiment of FIG. 6;
FIG. 11 is a flow chart illustrating a motion control method of a robot according to another exemplary embodiment;
FIG. 12 is a block diagram of a motion control apparatus of a robot according to an exemplary embodiment.
While specific embodiments of the invention have been shown by way of example in the drawings and will be described in detail hereinafter, such drawings and description are not intended to limit the scope of the inventive concepts in any way, but rather to explain the inventive concepts to those skilled in the art by reference to the particular embodiments.
Detailed Description
Reference will now be made in detail to the exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, like numbers in different drawings represent the same or similar elements unless otherwise indicated. The implementations described in the following exemplary embodiments do not represent all embodiments consistent with the present invention; rather, they are merely examples of apparatus and methods consistent with certain aspects of the invention, as detailed in the appended claims.
Referring to fig. 1, fig. 1 is a schematic diagram of an implementation environment according to the present invention. As shown in fig. 1, the implementation environment includes several adjacently spliced tiles on which the robot moves. It should be noted that the robot described in the present invention refers to a machine that automatically executes a motion process.
As shown in fig. 1, in one embodiment the spliced tiles are road maps. Each road map is laid with a road pattern and road edge patterns, and the road maps are spliced according to their road patterns to form the robot's motion path, so that the robot moves along the road pattern.
Fig. 2 shows 3 different types of road map: tile 5 is a straight road map, tile 6 is a crossroad map, and tile 7 is a curved road map. Road maps of the same or different types can be combined and spliced into different motion paths. For example, as shown in fig. 1, tile 1 and tile 2 are both straight maps and splice into a straight path, while tile 2 and tile 4 are joined through tile 3 into a turning path: the robot moves along tiles 1 and 2 to the center of tile 3, rotates 90° to the right, and then continues straight along tile 4.
A road map is divided into at least 3 map areas: a road area, common boundary areas, and edge areas. Different types of road map contain different numbers of common boundary and edge areas; as shown in fig. 2, a straight road map includes 2 common boundary areas and upper and lower edge areas, a crossroad map includes 4 common boundary areas and 4 corner edge areas, and a curved road map includes 2 common boundary areas and 3 edge areas.
The different map areas of a road map also carry corresponding map area codes. The angle codes shown in fig. 3 are laid over the entire road map. Road codes are laid in the road areas, and different road maps can be distinguished by their road codes. Common boundary codes are laid in the common boundary areas, and the common boundary code laid in each road map is the same. Edge area codes are laid in the edge areas; an edge area code depends on the position of its edge area relative to the road area, and edge areas at the same position relative to the road area carry the same code.
It should be noted that the angle codes shown in fig. 3 are laid on every road map, so when road maps are spliced only the correspondence between their road patterns needs to be considered; the laying directions of the angle codes need not be consistent.
In an exemplary embodiment, the map area codes are printed on the road map in a special paint that reflects red light, and the robot identifies and reads them through its photosensitive module.
In another embodiment, the spliced tiles form a grid map as shown in fig. 4. A scene pattern or character pattern is laid on each grid map; for example, tile 8 carries a grass pattern from a forest-exploration scene and tile 9 carries an animal pattern, and when grid maps are spliced they can be combined arbitrarily according to their scene and character patterns.
A grid map is divided into a central area, common boundary areas, and edge areas located above, below, to the left of, and to the right of the central area. When the robot moves on the spliced grid maps, it moves straight along the central areas, rotates in place by a certain angle within a central area, or combines the two.
The map areas of a grid map also carry corresponding map area codes: the angle codes shown in fig. 3 are laid over the whole grid map, edge area codes are laid in the edge areas, and common boundary codes are laid in the common boundary areas. Unlike a road map, the central area of a grid map additionally carries either a position code or a central area code; for example, the central area of tile 8 carries a position code and that of tile 9 carries a central area code, a position code being the position coordinates of the central area.
In an exemplary embodiment, the edge area codes are set according to the angle codes laid on the grid map. As shown by tiles 8 and 9 in fig. 4, the edge area code on the minimum-X side of the tile is set to the minimum value X, the code on the maximum-X side to X + 2, the code on the minimum-Y side to X + 1, and the code on the maximum-Y side to X + 3. When the central area carries a central area code, that code is accordingly X + 4.
In an exemplary embodiment, the minimum edge area code X is set according to the scene or character pattern laid on the grid map; for example, the grass pattern of tile 8 and the animal pattern of tile 9 correspond to different values of X.
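As a minimal sketch of this coding convention (the constant and function names are hypothetical, and the base value X depends on the pattern of the particular grid map):

```python
# Hypothetical sketch of the edge/center code convention described above.
# base_x is the minimum edge area code X of a given grid map; its actual
# value depends on the scene or character pattern laid on that map.

AREA_OFFSETS = {
    "x_min_edge": 0,  # edge area on the minimum-X side -> code X
    "y_min_edge": 1,  # edge area on the minimum-Y side -> code X + 1
    "x_max_edge": 2,  # edge area on the maximum-X side -> code X + 2
    "y_max_edge": 3,  # edge area on the maximum-Y side -> code X + 3
    "center": 4,      # central area (when coded)      -> code X + 4
}

def area_code(base_x: int, area: str) -> int:
    """Return the map area code laid in the given area of a grid map."""
    return base_x + AREA_OFFSETS[area]
```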
In another embodiment, the spliced tiles may be blank maps. When the robot moves on spliced blank maps, it follows the same motion strategy as on a road map or grid map, so that all of its motions occur in the central areas of the blank maps.
In yet another embodiment, the spliced tiles may also combine the road maps, grid maps, and blank maps described above, which is not limited here.
Fig. 5 is a block diagram illustrating a hardware configuration of a robot according to an exemplary embodiment. It should be noted that the robot is only an example adapted to the present invention, and should not be considered as providing any limitation to the scope of the present invention. The robot is also not to be construed as necessarily relying on or having to have one or more components of the exemplary robot shown in fig. 5.
As shown in fig. 5, the robot includes a processor 101, a memory 102, a remote control receiving module 103, a photosensitive module 104, a micromechanical inertial module 105, a driving motor 106, and an encoder 107.
The processor 101 is the robot's core data processing module and performs computations on the data stored in the memory 102.
The memory 102 stores computer-readable instructions and modules corresponding to the robot motion control method shown in the exemplary embodiments of the present invention; the processor 101 executes the computer-readable instructions stored in the memory 102, thereby performing various functions and data processing, for example motion control of the robot. The memory 102 may be random access memory, such as high-speed random access memory, non-volatile memory, one or more magnetic storage devices, flash memory, or other solid-state memory, and its storage may be transient or permanent.
The remote control receiving module 103 receives the motion control signals transmitted by a remote controller; by processing a motion control signal, the processor 101 obtains control information for controlling the robot's motion state, so as to make the robot move across the tiles along the motion path indicated by the signal, or to make it stop moving.
The photosensitive module 104, also called a photosensitive pen or optical recognition instrument, contacts the surface of the tiles while the robot moves on them and reads the map area codes of the map areas into which each tile is divided, thereby acquiring information such as the robot's position and yaw angle. The robot's motion attitude can then be corrected in real time from the acquired information, preventing the robot from deviating from the motion path indicated by the motion control signal.
It should be noted that the robot's motion attitude refers to its yaw angle during motion. When the robot deviates from the motion path indicated by the motion control signal, it cannot simply translate back onto the path; its current yaw angle is corrected first, and the path is then recovered during subsequent motion.
The micromechanical inertial module 105, also called a MEMS inertial module, includes a micromechanical inertial element and an attitude calculation unit. The micromechanical inertial element comprises a gyroscope and an accelerometer and measures the robot's angular velocity and acceleration in real time, producing the element's measurement results. The attitude calculation unit is provided with computer-readable instructions which, when executed, perform attitude calculation on the robot from the measurement results output by the micromechanical inertial element, thereby obtaining the robot's attitude information.
Illustratively, the attitude information includes angle information such as the robot's roll angle, pitch angle, and yaw angle, as well as its current position and speed information. The yaw angle is the robot's angle in the horizontal plane and can be used to determine its real-time heading. The robot can also correct its motion attitude in real time from the attitude information output by the micromechanical inertial module 105, preventing deviation from the motion path indicated by the motion control signal.
The driving motor 106 drives the robot's joints according to the control information output by the processor 101, making the robot walk, rotate, and perform other motions.
The encoder 107 is a sensor that acquires the robot's real-time motion information. It carries its own coding scale for recording the robot's travelled distance, from which, together with the motion time, the robot's speed can be obtained. Using the motion information output by the encoder 107, the robot can control its own position and speed, guaranteeing the precision of its routine motion.
To prevent the robot from deviating from the predetermined motion path while moving on the tiles, the exemplary embodiment shown in fig. 6 provides a robot motion control method that includes at least the following steps.
In step 210, in the static state, the robot corrects the measurement error of the micromechanical inertial element in the micromechanical inertial module according to the static attitude information output by the micromechanical inertial module and the map area code read by the photosensitive module.
As mentioned above, while the robot moves on the tiles its motion attitude must be corrected in real time according to the map area code read by the photosensitive module or the attitude information output by the micromechanical inertial module, to prevent it from deviating from the predetermined motion path. Accurate correction requires the micromechanical inertial module to output accurate attitude information and the photosensitive module to read the map area codes on the tiles accurately.
The attitude information output by the attitude calculation unit comprises a roll angle, a pitch angle, and a yaw angle. The roll and pitch angles can be corrected using the three-axis acceleration output by the accelerometer in the micromechanical inertial element, but the yaw angle cannot, so the attitude information output by the attitude calculation unit deviates from the robot's true attitude.
Because that attitude information is obtained by attitude calculation on the measurement results of the micromechanical inertial element, the root cause of the error is that the micromechanical inertial element has a measurement error, and this measurement error accumulates over time, so the error of the solved attitude information grows larger and larger. The measurement error of the micromechanical inertial element therefore needs to be corrected.
When the robot is in the static state, its theoretical speed is zero. The error between the static attitude information, solved by the attitude calculation unit from the element's output, and the position, yaw angle, and related information obtained by the photosensitive module from the map area codes reflects the robot's position error, yaw angle error, and speed error. From these errors, the constant drift of the gyroscope and the zero bias of the accelerometer in the micromechanical inertial element can be estimated, yielding the element's measurement error.
Therefore, when the robot is in the static state, the measurement error of the micromechanical inertial element in the micromechanical inertial module can be obtained according to the static attitude information output by the micromechanical inertial module and the map area code read by the photosensitive module.
Correcting the micromechanical inertial element with the obtained measurement error means that, after the element measures a given angular velocity and acceleration, the measurement error is subtracted from them and the resulting difference is output as the measurement result.
It follows that error correction of the micromechanical inertial element must be performed with the robot at rest on a tile that carries map area codes, such as any of the road maps or grid maps described above, which is not limited here.
In step 230, the robot switches from the static state to the motion state according to a triggered motion control signal, the motion control signal indicating that the robot is to move through designated map areas.
After the error correction performed in the static state, the micromechanical inertial element outputs accurate measurement results, so the attitude calculation unit outputs accurate attitude information and the robot's motion path can be corrected according to the attitude information output by its micromechanical inertial module.
In an exemplary embodiment, the triggered motion control signal is received by the robot through the remote control receiving module shown in fig. 5. For example, a user may trigger a control device (e.g., a control button or control lever) disposed on the remote controller, causing the remote controller to generate a corresponding motion control signal and transmit it to the remote control receiving module. The robot then moves on the tiles according to the motion control signal received by the remote control receiving module, switching from the static state to the motion state.
In other embodiments, the motion control signal may be obtained by other means, which is not limited here. Illustratively, the robot is configured with a motion path presetting device; a user presets a motion path for the robot by triggering this device, and the robot moves on the tiles according to the preset motion path.
It should be noted that the motion control signal instructs the robot to move through designated map areas on the tiles. For example, when the robot moves on road maps, the motion control signal may instruct it to move linearly (forward or backward) through the road areas of adjacently spliced straight road maps, to rotate by a fixed angle upon reaching the central position of a crossroad map or curved road map, or to stop moving upon reaching a designated place.
In the case where the robot moves on the grid map, the motion control signal may instruct the robot to perform a linear motion along the central area of the grid map, or instruct the robot to rotate in place by a fixed angle in the central area of the grid map.
After switching to the motion state, the robot moves on the tiles along the motion path indicated by the motion control signal. During motion, the robot may deviate from that path because the tile surface is uneven or the robot itself drifts, so its motion attitude needs to be corrected in real time.
In step 250, the motion attitude of the robot is corrected in real time according to the map area code read by the photosensitive module or the motion attitude information output by the micromechanical inertial module, the motion attitude information being obtained by the micromechanical inertial module performing attitude calculation on the measurement results output by the micromechanical inertial element.
As mentioned above, during motion the robot's motion attitude refers to its yaw angle. When the robot deviates from the path indicated by the motion control signal, its current yaw angle is corrected so that it gradually recovers the indicated path as it moves. That is, correcting the motion attitude is achieved by correcting the robot's yaw angle in real time during motion.
Since both the map area code read by the photosensitive module and the motion attitude information output by the micromechanical inertial module can be used to correct the motion attitude, the robot needs to select the appropriate source for the correction.
In an exemplary embodiment, the robot preferentially reads the map area code on the current tile through the photosensitive module; when the photosensitive module reads a map area code, the robot corrects its motion attitude in real time according to the code that is read.
If the photosensitive module cannot read a map area code, the robot corrects the motion attitude in real time according to the motion attitude information output by the micromechanical inertial module instead. For example, when the map area codes laid on the tile are worn, or the robot is moving on a blank-map tile, the photosensitive module cannot read a corresponding map area code.
In another embodiment, the robot may select either of the two manners to correct its motion attitude, which is not limited here.
It should be noted that, because the measurement error of the micromechanical inertial element was corrected in step 210, that error is removed from the element's output during the robot's motion; the motion attitude information solved from those measurements by the attitude calculation unit is therefore accurate and can be used to correct the robot's motion attitude precisely.
In this embodiment, the measurement error of the micromechanical inertial element is corrected in the static state, so that after the robot switches from the static state to the motion state the micromechanical inertial module obtains the robot's current motion attitude information accurately. The map area codes read by the photosensitive module and the motion attitude information output by the micromechanical inertial module during motion are likewise accurate, so the robot's motion attitude can be corrected accurately in real time, ensuring that the robot moves across the spliced tiles along the motion path indicated by the motion control signal.
FIG. 7 is a flow chart of one embodiment of step 210 in the corresponding embodiment of FIG. 6. As shown in fig. 7, correcting the measurement error of the micromechanical inertial element in the static state includes at least the following steps:
In step 211, in the static state, the robot acquires the static attitude information output by the micromechanical inertial module and the map area code read by the photosensitive module.
As mentioned above, since the robot's theoretical speed is zero in the static state, the discrepancy between the static attitude information output by the micromechanical inertial module and the map area code read by the photosensitive module reflects the robot's position error, yaw angle error, and speed error, from which the measurement error of the micromechanical inertial element can be obtained. It is therefore necessary to obtain both the static attitude information output by the micromechanical inertial module and the map area code read by the photosensitive module.
It should be noted that the static attitude information output by the attitude calculation unit includes the robot's current position, yaw angle, and speed information, while the photosensitive module obtains the robot's current position and yaw angle by reading the map area codes on the tile.
Because the map area codes are precisely designed before being laid on the tiles, the position and yaw angle that the photosensitive module derives from them are accurate. Any discrepancy between the static attitude information output by the micromechanical inertial module and the values read by the photosensitive module is therefore caused by the measurement error of the micromechanical inertial element, and that measurement error can be recovered from the discrepancy.
In step 213, the measurement error of the micromechanical inertial element in the micromechanical inertial module is obtained by computing the difference between the static attitude information and the values derived from the map area code, and filtering the result.
Computing this difference means computing, for each of position, yaw angle, and speed, the difference between the value in the static attitude information and the corresponding reference (the position and yaw angle derived from the map area code, and zero speed), yielding the robot's position error, yaw angle error, and speed error.
The position error, yaw angle error, and speed error are fed into a filter; the filter can then estimate the constant drift of the gyroscope and the zero bias of the accelerometer in the micromechanical inertial element well, giving the element's measurement error.
The filter may be a Kalman filter, an adaptive filter, or another filter; this embodiment may use a Kalman filter, which is not limited here. The filter's output is the measurement error of the micromechanical inertial element.
In step 215, the difference between the quantities measured by the micromechanical inertial element and the measurement error is computed, and that difference is taken as the measurement result output by the element.
That is, after the filter yields the measurement error, and once the micromechanical inertial element has measured the corresponding angular velocity and acceleration, the measurement error is subtracted from them and the difference is output as the element's measurement result.
In this embodiment, because the measurement error of the micromechanical element is corrected in the static state, the measurement results that the micromechanical inertial element outputs after the robot switches to the motion state are accurate, with the gyroscope's constant drift and the accelerometer's zero bias removed; the motion attitude information output by the micromechanical inertial module is therefore also accurate, laying a data foundation for precise correction of the robot's motion attitude.
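As a concrete illustration of steps 211 through 215, the sketch below estimates the gyroscope's constant drift and the accelerometer's zero bias from stationary readings and subtracts them from subsequent measurements. It is only a minimal stand-in under stated assumptions: a per-axis scalar Kalman update replaces the full filter, and the construction of the rest references (zero angular rate, zero velocity, pose from the map area code) is assumed rather than taken from the patent.

```python
import numpy as np

class StaticBiasEstimator:
    """Minimal sketch of steps 211-215: estimate the constant drift of the
    gyroscope and the zero bias of the accelerometer while the robot is at
    rest, then subtract them from raw readings (hypothetical interface)."""

    def __init__(self, n_terms=6, q=1e-6, r=1e-2):
        self.bias = np.zeros(n_terms)  # 3 gyro drift terms + 3 accel zero-bias terms
        self.p = np.ones(n_terms)      # per-term estimate covariance
        self.q, self.r = q, r          # process / measurement noise (assumed values)

    def update(self, raw, reference):
        """Step 213: the difference between what the stationary sensors read
        and what they should read (zero rates; gravity-only acceleration,
        oriented by the pose read from the map area code) is filtered into
        a bias estimate with a scalar Kalman update per term."""
        innovation = raw - reference - self.bias
        self.p = self.p + self.q
        k = self.p / (self.p + self.r)   # Kalman gain
        self.bias += k * innovation
        self.p *= 1.0 - k

    def correct(self, raw):
        """Step 215: measurement result = measured quantity minus the error."""
        return raw - self.bias
```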
FIG. 8 is a flow chart of one embodiment of step 250 in the corresponding embodiment of FIG. 6. As shown in fig. 8, the motion state that the robot switches to according to the motion control signal includes linear motion within the map areas designated by the path, and the real-time correction of the robot's motion attitude described in step 250 includes at least the following steps:
In step 251, the robot's position information and yaw angle on the tile are acquired according to the map area code read by the photosensitive module or the motion attitude information output by the micromechanical inertial module.
The robot's position on the tile reflects how far it has deviated from the motion path indicated by the motion control signal, and its yaw angle represents its current motion attitude; when the motion attitude is corrected, the yaw angle needs to be corrected in real time according to the robot's current position information.
The motion attitude information output by the micromechanical inertial module contains the robot's current position and yaw angle directly, but the map area code read by the photosensitive module does not; when the code is read, the position and yaw angle must therefore be derived from it.
The detailed process of acquiring the robot's position and yaw angle on the tiles from the map area code read by the photosensitive module is described below, taking a grid map as an example.
As mentioned above, since the angle codes shown in fig. 3 are laid over the whole grid map, wherever the robot moves on the grid tiles the photosensitive module can read the angle code at the robot's current position, and that angle code is the robot's current yaw angle.
When the robot moves into a common boundary area or an edge area, the map area codes read by the photosensitive module include the common boundary code or edge area code in addition to the angle code; when it moves into the central area, they include a position code or a central area code in addition to the angle code, the type of code depending on the type of grid map.
When the central area carries a position code and the robot moves within it, the photosensitive module reads the robot's current position information directly.
Otherwise, the map area code read by the photosensitive module does not directly reflect the robot's current position. In an exemplary embodiment, the current position is then derived from the map area code as follows.
After the photosensitive module reads the map area code of the map area where the robot currently is, the difference between that code and the minimum edge area code of the grid map is computed, and the remainder of that difference modulo the total number of edge areas (or of edge areas plus the central area) is taken; the robot's current position coordinate is then set according to the remainder, giving its current position information.
If the central area of the grid map carries a position code, the difference is taken modulo 4; if it carries a central area code, the difference is taken modulo 5.
A remainder of 0 indicates that the robot is currently in the edge area on the minimum-X side and moving along the X axis, so the X coordinate is set to the minimum value; similarly, a remainder of 1 sets the Y coordinate to the minimum value, a remainder of 2 sets the X coordinate to the maximum value, and a remainder of 3 sets the Y coordinate to the maximum value.
It should be noted that the maximum and minimum X and Y coordinates are preset values. For example, with a preset coordinate range of 30 to 110, the maximum coordinate is 110 and the minimum is 30, and the center of the grid tile is correspondingly at (70, 70).
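A minimal sketch of this decoding follows (the function name and base_x, the grid map's minimum edge area code, are assumptions of the sketch; the 30 to 110 range follows the example above):

```python
COORD_MIN, COORD_MAX = 30, 110   # preset coordinate range from the example above

def position_from_code(code: int, base_x: int, center_has_position_code: bool):
    """Derive the coordinate pinned down by a map area code on a grid map.

    Returns (axis, value): the axis whose coordinate the code fixes and its
    value; the robot moves along the other axis. A remainder of 4 (possible
    only with a central area code) means the robot is in the central area."""
    modulus = 4 if center_has_position_code else 5
    remainder = (code - base_x) % modulus
    if remainder == 0:
        return ("x", COORD_MIN)   # edge area on the minimum-X side
    if remainder == 1:
        return ("y", COORD_MIN)   # edge area on the minimum-Y side
    if remainder == 2:
        return ("x", COORD_MAX)   # edge area on the maximum-X side
    if remainder == 3:
        return ("y", COORD_MAX)   # edge area on the maximum-Y side
    return ("center", None)       # central area code
```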
In step 253, the motion error amount of the robot's linear motion is calculated according to the position information and the yaw angle.
The motion error amount of the robot's linear motion comprises a position error and a first yaw angle error: the position error is the error between the robot's current position information and the center position of the tile, and the first yaw angle error is the error between the robot's current yaw angle and the target yaw angle.
The target yaw angle is obtained from the angle code read by the photosensitive module and represents the robot's target direction of motion. As shown in fig. 3, the angle codes laid on the grid map are marked with an X axis and a Y axis. When the current yaw angle corresponding to the angle code read by the photosensitive module is between 45° and 135°, the robot is moving about the X axis with the positive X direction as its target direction, so the acquired target yaw angle is 90°. Similarly, when the current yaw angle is between 135° and 225°, the target yaw angle is 180°; between 225° and 315°, the target yaw angle is 270°; and between 0° and 45° or between 315° and 360°, the target yaw angle is 0°.
Therefore, the target yaw angle is associated with the current yaw angle through a preset mapping relation and can be obtained correspondingly from the robot's current yaw angle.
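A sketch of this mapping (how ties at the exact boundary angles are broken is an assumption; the source only assigns the ranges as described):

```python
def target_yaw(current_yaw: float) -> float:
    """Map the robot's current yaw angle (degrees) to its target yaw angle,
    following the preset mapping relation described above."""
    yaw = current_yaw % 360.0
    if 45.0 <= yaw < 135.0:
        return 90.0    # moving about the X axis, positive X direction
    if 135.0 <= yaw < 225.0:
        return 180.0
    if 225.0 <= yaw < 315.0:
        return 270.0
    return 0.0         # 0-45 degrees or 315-360 degrees
```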
In an exemplary embodiment, as shown in fig. 9, calculating the motion error amount of the robot performing the linear motion according to the position information and the yaw angle of the robot includes at least the steps of:
step 2531, respectively calculating the difference between the position information and the central position of the jigsaw puzzle and the difference between the yaw angle and the target yaw angle to obtain the position error and the first yaw angle error of the robot, wherein the target yaw angle is associated with the yaw angle through a preset mapping relation.
Wherein, because the current position information of the robot is obtained according to the content described in step 231, the current position error of the robot can be obtained by calculating the difference between the position information and the center position of the jigsaw where the robot is located.
When the robot moves along the X-axis direction on the jigsaw puzzle, the position error of the robot is reflected by the X-axis position coordinate, and the position error of the robot can be obtained by calculating the difference between the X-axis position coordinate and the central position coordinate of the jigsaw puzzle. Similarly, when the robot moves along the Y-axis direction on the jigsaw puzzle, the position error of the robot is reflected by the Y-axis position coordinate, and the position error of the robot can be obtained by calculating the difference between the Y-axis position coordinate and the central position coordinate of the jigsaw puzzle.
The current first yaw angle error of the robot can be obtained by calculating the difference between the current yaw angle of the robot and the target yaw angle. Please refer to the foregoing contents for the process of acquiring the current yaw angle and the target yaw angle of the robot, which is not described herein again.
In step 2533, the position error is converted into a second yaw angle error according to a set conversion rule.
The position error is the deviation between the robot's current position and the center position of the tile. Because the robot cannot translate during correction and must instead rotate by a certain angle and correct itself during subsequent motion, the position error needs to be converted into a second yaw angle error in order to correct the motion attitude.
For example, the motion error amount of the robot may be expressed by the formula σ = k1·σp + σa, where σ is the motion error amount, σp is the position error, σa is the first yaw angle error, and k1 is the set conversion rule. The magnitude of k1 is a set value determined from actual debugging experience and correction effect, for example 0.9, and its sign depends on the robot's target yaw angle and direction of motion.
As shown in fig. 3, the position coordinates of the tile increase gradually along the coordinate axes; the robot's yaw angle increases gradually as it rotates left, and the motion error amount increases with it, while the yaw angle decreases as it rotates right and the motion error amount decreases. The setting of the sign of k1 is described below.
In the forward state of the robot, if the target yaw angle is 0° or 270°, k1 takes the positive sign, and if the target yaw angle is 90° or 180°, k1 takes the negative sign; in the backward state, if the target yaw angle is 0° or 270°, k1 takes the negative sign, and if it is 90° or 180°, the positive sign.
Thus, after the sign of k1 is determined, the second yaw angle error is obtained by computing the product of the robot's position error and k1.
In step 2535, the motion error amount is obtained by computing the sum of the first yaw angle error and the second yaw angle error.
Referring again to the formula for the motion error amount, once the second yaw angle error is obtained, the motion error amount of the robot is the sum of the first yaw angle error and the second yaw angle error.
Therefore, the motion error amount comprehensively considers the robot's current position error and yaw angle error and can accurately describe its current attitude error.
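Under the reconstructed formula σ = k1·σp + σa, a minimal sketch of the computation in steps 2531 through 2535 follows (the parameter names are assumptions; the magnitude 0.9 and the sign rule follow the example above):

```python
K1_MAGNITUDE = 0.9   # example tuning value from the text above

def motion_error(position: float, center: float, yaw: float,
                 tgt_yaw: float, forward: bool) -> float:
    """sigma = k1 * sigma_p + sigma_a.

    position and center are coordinates along the axis of travel; the sign
    of k1 follows the forward/backward rule described above."""
    sigma_p = position - center          # position error (step 2531)
    sigma_a = yaw - tgt_yaw              # first yaw angle error (step 2531)
    positive = tgt_yaw in (0.0, 270.0)   # forward: 0 or 270 deg -> positive k1
    if not forward:
        positive = not positive          # backward motion reverses the sign
    k1 = K1_MAGNITUDE if positive else -K1_MAGNITUDE
    return k1 * sigma_p + sigma_a        # second plus first yaw error (step 2535)
```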
In step 255, the robot is controlled to correct its motion attitude in real time according to the value and the sign of the motion error amount, the sign indicating the direction of the correction.
The value of the motion error amount reflects the degree of error in the robot's current motion attitude; when the value exceeds a set value (for example, 5°), the error in the current attitude is large and the motion attitude needs to be corrected.
The sign of the motion error amount indicates the direction in which the motion attitude should be corrected, so that the value of the motion error amount decreases gradually during motion until it falls below the set value. Illustratively, the robot is controlled to rotate right when the motion error amount is greater than 5°, and to rotate left when it is less than -5°.
In another exemplary embodiment, while correcting the motion attitude, the robot stops the correction when the value of the motion error amount falls below a first set value, the first set value being slightly smaller than the set value. For example, with a first set value of 3°, a motion error amount between -3° and 3° indicates that the robot's motion attitude error is small and needs no correction, and the robot is controlled to continue its linear motion.
It should be noted that the first set value is chosen slightly smaller than the set value to prevent the robot from switching frequently between correction and linear motion when the yaw angle error is near the critical value.
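A minimal sketch of this hysteresis between correction and straight-line motion (the 5° and 3° thresholds follow the examples above; the state-machine formulation and command names are assumptions):

```python
SET_VALUE = 5.0         # deg: start correcting beyond this error
FIRST_SET_VALUE = 3.0   # deg: stop correcting once the error is inside this band

def correction_command(sigma: float, correcting: bool):
    """Return (still_correcting, command) for the current motion error sigma.

    The two thresholds differ so the robot does not switch back and forth
    between correction and linear motion at a critical yaw angle error."""
    if correcting:
        if abs(sigma) < FIRST_SET_VALUE:
            return False, "straight"   # error small again: keep linear motion
        return True, "rotate_right" if sigma > 0.0 else "rotate_left"
    if sigma > SET_VALUE:
        return True, "rotate_right"
    if sigma < -SET_VALUE:
        return True, "rotate_left"
    return False, "straight"
```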
It should be further noted that, for the road map, the motion attitude of the robot is still corrected in real time according to the process described above, which is not repeated here.
The method provided by this embodiment can therefore control the robot to correct its motion attitude in real time during linear motion on the tiles, so that the robot travels along the central area of a grid map or the road area of a road map.
In another exemplary embodiment, when the robot moves linearly on the tiles, it can also stop precisely at the center position of a tile according to an acquired motion stop signal. As shown in fig. 10, in an exemplary embodiment, controlling the robot to stop exactly at the center position of the tile during linear motion includes at least the following steps.
In step 310, during the linear motion, the robot acquires a triggered motion stop signal and, according to the motion stop signal, reads the common boundary code among the map area codes.
The motion stop signal acquired by the robot can likewise be received through the remote control receiving module shown in fig. 5: after the user triggers a stop button on the remote controller, the remote controller generates a motion stop signal and sends it to the remote control receiving module, whereby the robot obtains the motion stop signal.
To ensure that the robot stops exactly at the center position of the tile, after acquiring the motion stop signal it needs to read the common boundary code in the map area codes through the photosensitive module. The purpose of reading the common boundary code is to locate the tile's common boundary area, the adjacent region along which the tiles are spliced, which is used to position the robot's stopping place.
In step 330, if the common boundary code is read, the robot is controlled to stop upon moving to the center position of the tile; alternatively, the distance travelled after the common boundary code is computed through the encoder the robot is configured with, and the robot is controlled to stop when that distance reaches half the tile length.
If the robot reads the common boundary code as it moves into the common boundary area of the current tile, it is controlled to stop upon moving to the center position of the current tile; if it reads the code as it moves out of the common boundary area of the current tile, it is controlled to stop upon moving to the center position of the next tile.
It should be noted that, because adjacent tiles each contain a common boundary area, if the robot reads the common boundary code again as it continues moving after a first reading, the first reading occurred as the robot was moving out of the common boundary area of the current tile; otherwise it is judged to have been moving into the common boundary area of the current tile.
If the robot can read the position coordinates of the tile's center position, it is controlled to stop moving when the photosensitive module reads those coordinates, so that it stops at the exact place.
If the robot cannot read the position coordinates of the center position, the center must be located through the encoder. Because the distance from a tile's common boundary area to its center is half the tile length, and the length of every tile is fixed, after the robot reads the common boundary code it measures the subsequent travel through the encoder and is controlled to stop when that distance reaches half the tile length, which guarantees a stop at the tile's center position.
In an exemplary embodiment, the set stopping distance may be slightly greater than half the tile length, which avoids the robot stopping prematurely.
In step 350, otherwise, the distance the robot travels after acquiring the motion stop signal is computed through the encoder, and the robot stops when that distance reaches an integer multiple of the tile length.
If the map area codes laid on the tiles are worn, or the tiles are blank maps, the robot cannot read the common boundary code; its travel after acquiring the motion stop signal is then measured through the encoder, and it stops moving when the distance reaches an integer multiple of the tile length.
When the robot moves on tiles with worn map area codes or on a blank map, the position where it starts moving is taken by default to be the center position of a tile, and it measures its travelled distance from the start of the motion; after the motion stop signal is acquired, the robot is controlled to stop when the distance reaches an integer multiple of the tile length, so that it is still controlled according to the motion strategies of the grid map and the road map.
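A minimal sketch of the two stopping strategies (the function and parameter names, and the 5% margin standing in for "slightly greater than half the tile length", are assumptions):

```python
import math

def remaining_stop_distance(boundary_read: bool, tile_len: float,
                            dist_since_boundary: float,
                            dist_since_start: float) -> float:
    """How much farther the robot should travel before stopping.

    boundary_read: whether a common boundary code has been read after the
    motion stop signal; distances come from the encoder's coding scale."""
    if boundary_read:
        # The tile center lies half a tile length past the common boundary;
        # a small margin avoids a premature stop (step 330).
        target = 0.5 * tile_len * 1.05
        return max(0.0, target - dist_since_boundary)
    # Worn codes or a blank map (step 350): the start point is taken as a
    # tile center, so stop at the next integer multiple of the tile length.
    target = math.ceil(dist_since_start / tile_len) * tile_len
    return target - dist_since_start
```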
The method provided by this embodiment can therefore adapt the robot's stopping strategy to different conditions, controlling the robot to stop exactly at the center position of the tile.
In another exemplary embodiment, as shown in fig. 11, when the robot performs an in-place rotation in the central area of a tile, correcting its motion attitude in real time includes at least the following steps.
In step 410, during the robot's in-place rotation, the yaw angle is acquired in real time from the map area code or from the motion attitude information.
First, it should be noted that the rotation the robot performs is also executed according to a triggered motion control signal. For example, in the embodiment shown in fig. 1, after the robot moves to the center position of tile 3, the user controls it to rotate 90° to the right at that position by triggering the rotation button on the remote controller.
In step 430, a difference between the yaw angle and the target rotation angle included in the motion control signal is calculated.
During the in-situ rotation of the robot, when the current yaw angle of the robot is consistent with the target rotation angle, the robot has reached the target attitude and the rotation can be stopped.
Since the target rotation angle generally refers to an angle value rotated in a certain direction, the yaw angle and the target rotation angle need to be converted into a common representation before their difference is calculated. For example, the target yaw angle that the robot must reach to attain the target attitude can be obtained from the target rotation angle, and the difference between the current yaw angle and this target yaw angle is then calculated during the rotation.
Alternatively, when the target rotation angle included in the motion control signal is itself a target yaw angle, the difference between the current yaw angle and the target yaw angle is calculated directly.
In step 450, when the difference is smaller than a set threshold, the robot is controlled to stop the in-situ rotation.
A difference between the current yaw angle and the target yaw angle smaller than the set threshold indicates that the rotation has reached the target attitude, so the robot is controlled to stop rotating in place.
In an exemplary embodiment, the current yaw angle of the robot is acquired under the dual constraint of the photosensitive module and the micro-mechanical inertial module. Dual constraint means that the robot acquires the current yaw angle in real time through both the photosensitive module and the micro-mechanical inertial module and calculates, for each, the difference from the target yaw angle; when the difference obtained from either module is smaller than the set threshold, the robot is considered to have rotated to the target attitude.
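The following sketch illustrates one way such a dual-constraint rotation loop could look. The conversion of a relative rotation command into a target yaw angle follows the example above; the method names (yaw_from_code, yaw_from_imu, start_rotation, stop) and the threshold value are assumptions, not the patent's interface.

```python
YAW_THRESHOLD = 2.0  # assumed stop threshold, in degrees

def rotate_in_place(robot, target_rotation_deg):
    """Rotate by a signed angle; stop when either the photosensitive
    module or the inertial module reports the target yaw reached."""
    # convert the relative rotation command into a target yaw angle
    target_yaw = (robot.yaw_from_imu() + target_rotation_deg) % 360.0

    robot.start_rotation()
    while True:
        # dual constraint: yaw from the map code (may be unreadable)
        # and yaw from the micro-mechanical inertial module
        for yaw in (robot.yaw_from_code(), robot.yaw_from_imu()):
            if yaw is None:
                continue
            # smallest signed angular difference, folded into (-180, 180]
            diff = abs((yaw - target_yaw + 180.0) % 360.0 - 180.0)
            if diff < YAW_THRESHOLD:   # either module suffices to stop
                robot.stop()
                return
```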
Therefore, the method provided by this embodiment enables the robot to perform the in-situ rotation motion accurately.
In another exemplary embodiment, the motion control method of the robot further includes a process of correcting the motion speed of the robot in real time, where the process includes:
in the motion state, the robot corrects the motion speed error in real time according to the encoder scale values of the configured encoder, so that the motion speed of the robot is consistent with the target motion speed set in the motion control signal.
The motion speed error of the robot is the difference between the real-time speed of the robot and the target speed contained in the motion control signal. As mentioned above, the real-time speed of the robot is obtained from the encoder scale values collected by the encoder in real time.
After the real-time speed of the robot is obtained through the encoder, the PWM value of the driving motor is adjusted according to the difference between the real-time speed and the target speed, so that the motion speed of the robot converges to the target speed and the robot moves on the puzzle at the target speed set by the motion control signal.
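A minimal sketch of one such correction step is given below. The patent does not specify the control law; the proportional update, the gain KP, the PWM range, and the method names are all assumptions made for illustration.

```python
KP = 0.8  # assumed proportional gain

def correct_speed(robot, target_speed):
    """One speed-correction step: nudge the drive-motor PWM so that
    the encoder-derived speed approaches the target speed."""
    # speed error = target speed minus encoder-derived real-time speed
    error = target_speed - robot.real_time_speed()
    # adjust the PWM value of the driving motor in proportion to the error
    pwm = robot.current_pwm() + KP * error
    robot.set_pwm(max(0.0, min(255.0, pwm)))  # clamp to a valid PWM range
```

Called once per control cycle, this drives the speed error toward zero so the robot tracks the target speed set by the motion control signal.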
Fig. 12 is a diagram illustrating a motion control device of a robot according to an exemplary embodiment. It should be noted that the motion of the robot is performed on a plurality of spliced puzzles, at least one of which is divided into a plurality of map areas containing map area codes, and the robot is configured with a photosensitive module and a micro-mechanical inertial module.
As shown in fig. 12, the apparatus shown in this exemplary embodiment includes a measurement error correction module 510, a motion state switching module 530, and a motion attitude correction module 550.
The measurement error correction module 510 is configured to, in a static state, control the robot to correct the measurement error of the micro-mechanical inertial element in the micro-mechanical inertial module according to the static attitude information output by the micro-mechanical inertial module and the map area code read by the photosensitive module.
The motion state switching module 530 is configured to control the robot to switch from a stationary state to a motion state according to a triggered motion control signal, where the motion control signal indicates that the robot moves through a designated map area.
The motion attitude correction module 550 is configured to correct the motion attitude of the robot in real time according to the map area code read by the photosensitive module or the motion attitude information output by the micro-mechanical inertial module, where the motion attitude information is obtained by the micro-mechanical inertial module performing attitude calculation on the measurement result output by the micro-mechanical inertial element.
In another exemplary embodiment, the measurement error correction module 510 includes a first information acquisition unit, a measurement error acquisition unit, and an error correction unit.
The first information acquisition unit is used for controlling the robot to acquire static attitude information output by the micro-mechanical inertia module and map area codes read by the photosensitive module in a static state.
The measurement error acquisition unit is used for calculating the difference value of the static attitude information and the map area code and filtering the calculation result to obtain the measurement error of the micro-mechanical inertial element in the micro-mechanical inertial module.
The error correction unit is used for calculating the difference between the information measured by the micro-mechanical inertial element and the measurement error, and taking this difference as the measurement result output by the micro-mechanical inertial element.
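One way these three units could fit together is sketched below: a static measurement-error (bias) estimate is filtered from the difference between the inertial attitude and the attitude implied by the map area code, then subtracted from raw readings. The exponential moving average stands in for the patent's unspecified filtering step, and all names are hypothetical.

```python
class BiasCorrector:
    """Sketch of the measurement error correction module 510:
    estimate a static bias of the micro-mechanical inertial element
    and subtract it from subsequent measurements."""

    def __init__(self, alpha=0.05):
        self.alpha = alpha  # assumed low-pass filter coefficient
        self.bias = 0.0     # current measurement-error estimate

    def update_static(self, imu_yaw, code_yaw):
        """Call repeatedly while the robot is stationary."""
        # raw error sample: inertial reading minus map-code reference
        sample = imu_yaw - code_yaw
        # first-order low-pass filter suppresses sensor noise
        self.bias = (1 - self.alpha) * self.bias + self.alpha * sample

    def corrected(self, imu_yaw):
        """Measurement result output by the inertial element."""
        return imu_yaw - self.bias
```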
In another exemplary embodiment, the motion state switched by the robot includes that the robot performs a linear motion through a designated map area, and the motion posture correction module 550 includes a second information acquisition unit, a motion error amount acquisition unit, and a motion control unit.
The second information acquisition unit is used for controlling the robot to acquire position information and a yaw angle on the jigsaw puzzle according to the map area code or the motion attitude information.
The motion error amount acquisition unit is used for calculating, from the position information and the yaw angle, the motion error amount of the robot performing the linear motion.
The motion control unit is used for controlling the robot to correct the motion attitude in real time according to the magnitude and sign of the motion error amount, where the sign indicates the direction of the motion attitude correction.
In another exemplary embodiment, the motion error amount acquisition unit includes an information calculation subunit, an error conversion subunit, and an error amount calculation subunit.
The information calculation subunit is used for respectively calculating the difference between the position information and the central position of the jigsaw puzzle and the difference between the yaw angle and the target yaw angle to obtain the position error and the first yaw angle error of the robot, and the target yaw angle is associated with the yaw angle through a preset mapping relation.
The error conversion subunit is used for converting the position error into a second yaw angle error according to a set conversion rule.
The error amount calculation subunit is used for obtaining the motion error amount by calculating the sum of the first yaw angle error and the second yaw angle error.
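A compact sketch of this error combination follows. The linear gain K_POS is an assumed stand-in for the patent's unspecified "set conversion rule" from lateral position error to a yaw-angle error.

```python
K_POS = 1.5  # assumed gain: lateral offset units -> degrees of yaw error

def motion_error_amount(pos_offset, yaw, target_yaw):
    """Combine position and heading errors into one signed error amount,
    whose sign indicates the direction of the attitude correction."""
    first_yaw_error = yaw - target_yaw      # heading error, in degrees
    second_yaw_error = K_POS * pos_offset   # converted position error
    return first_yaw_error + second_yaw_error

# e.g. motion_error_amount(pos_offset=0.5, yaw=92.0, target_yaw=90.0) -> 2.75
```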
In another exemplary embodiment, the motion posture correction module 550 further includes a motion stop signal processing unit, a first motion control unit, and a second motion control unit.
The motion stop signal processing unit is used for controlling the robot to acquire a triggered motion stop signal in the linear motion, and reading the common boundary code in the map area code according to the motion stop signal.
The first motion control unit is used for, when the common boundary code is read, controlling the robot to stop moving when it moves to the center position of the puzzle, or calculating, through an encoder configured on the robot, the movement distance after the common boundary code is read, and controlling the robot to stop moving when the movement distance reaches half of the puzzle length.
The second motion control unit is used for, when the common boundary code is not read, calculating through the encoder the movement distance of the robot after it acquires the motion stop signal, and stopping the motion when the movement distance reaches an integral multiple of the puzzle length.
In another exemplary embodiment, the motion state switched by the robot further includes that the robot performs in-place rotation motion on a designated map area, and the motion posture correction module 550 includes a third information acquisition unit, a yaw angle difference calculation unit, and a rotation control unit.
The third information acquisition unit is used for controlling the robot to acquire the yaw angle in real time through the map area code or the motion attitude information during the in-situ rotation motion.
The yaw angle difference calculation unit is used for calculating the difference between the yaw angle and the target rotation angle contained in the motion control signal.
The rotation control unit is used for controlling the robot to stop the in-situ rotation motion when the difference is smaller than a set threshold.
It should be noted that the apparatus provided in the foregoing embodiment and the method provided in the foregoing embodiment belong to the same concept, and the specific manner in which each module performs operations has been described in detail in the method embodiment, and is not described again here.
In an exemplary embodiment, a robot includes a processor and a memory, wherein the memory is used for storing executable instructions of the processor, and the processor is configured to execute the motion control method of the robot described in any one of the above exemplary embodiments by executing the executable instructions.
In an exemplary embodiment, a computer-readable storage medium has stored thereon a computer program which, when executed by a processor, implements a motion control method of a robot described in any of the above exemplary embodiments.
The above description is only a preferred exemplary embodiment of the present application, and is not intended to limit the embodiments of the present application, and those skilled in the art can easily make various changes and modifications according to the main concept and spirit of the present application, so that the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (11)

1. A motion control method of a robot is characterized in that the motion of the robot is performed on a plurality of spliced puzzles, at least one puzzle is divided into a plurality of map areas, the map areas contain map area codes, the robot is provided with a photosensitive module and a micro-mechanical inertia module, and the method comprises the following steps:
in a static state, the robot corrects the measurement error of a micro-mechanical inertial element in the micro-mechanical inertial module according to static attitude information output by the micro-mechanical inertial module and the map area code read by the photosensitive module;
according to a triggered motion control signal, the robot is switched from the static state to a motion state, and the motion control signal indicates that the robot moves through the designated map area;
and correcting the motion attitude of the robot in real time according to the map area code read by the photosensitive module or the motion attitude information output by the micro-mechanical inertia module, wherein the motion attitude information is obtained by the micro-mechanical inertia module performing attitude calculation on the measurement result output by the micro-mechanical inertia element.
2. The method according to claim 1, wherein in the static state, the robot corrects the measurement error of the micromechanical inertial element in the micromechanical inertial module according to the static attitude information output by the micromechanical inertial module and the map area code read by the photosensitive module, and the method comprises:
in a static state, the robot acquires static attitude information output by the micro-mechanical inertia module and a map area code read by the photosensitive module;
calculating a difference value between the static attitude information and the map area code, and filtering a calculation result to obtain a measurement error of a micro-mechanical inertial element in the micro-mechanical inertial module;
and calculating a difference value between the information measured by the micro-mechanical inertia element and the measurement error, and taking the difference value as the measurement result output by the micro-mechanical inertia element.
3. The method according to claim 1, wherein the correcting the motion attitude of the robot in real time according to the map area code read by the photosensitive module or the motion attitude information output by the micromechanical inertial module comprises:
when the photosensitive module reads the map area code, the robot corrects the motion posture of the robot in real time according to the map area code;
otherwise, the robot executes real-time correction of the motion attitude according to the motion attitude information output by the micro-mechanical inertia module.
4. The method according to claim 1, wherein the motion state includes that the robot performs linear motion via the designated map area, and the real-time correction of the motion attitude of the robot according to the map area code read by the photosensitive module or the motion attitude information output by the micromechanical inertial module includes:
the robot acquires position information and a yaw angle on the jigsaw according to the map area code or the motion attitude information;
calculating a motion error amount of the robot performing the linear motion through the position information and the yaw angle;
and controlling the robot to correct the motion attitude in real time according to the numerical value and the positive and negative directions of the motion error quantity, wherein the positive and negative directions indicate the motion attitude correction direction of the robot.
5. The method according to claim 4, wherein the calculating of the motion error amount of the robot performing the linear motion by using the position information and the yaw angle includes:
respectively calculating the difference between the position information and the central position of the jigsaw puzzle and the difference between the yaw angle and a target yaw angle to obtain a position error and a first yaw angle error of the robot, wherein the target yaw angle is associated with the yaw angle through a preset mapping relation;
converting the position error into a second yaw angle error according to a set conversion rule;
the motion error amount is obtained by calculating a sum of the first yaw angle error and the second yaw angle error.
6. The method of claim 4, further comprising:
in the linear motion, the robot acquires a triggered motion stop signal and reads a common boundary code in the map area code according to the motion stop signal;
if the common boundary code is read, controlling the robot to stop moving when the robot moves to the central position of the jigsaw puzzle; or calculating, through an encoder configured on the robot, a movement distance after the common boundary code is read, and controlling the robot to stop moving when the movement distance reaches half of the length of the jigsaw puzzle;
otherwise, calculating the movement distance of the robot after the robot acquires the movement stop signal through the encoder, and stopping the movement when the movement distance reaches the integral multiple of the puzzle length.
7. The method according to claim 1, wherein the motion state includes that the robot performs in-situ rotation motion on a designated map area, and the motion attitude of the robot is corrected in real time according to the map area code read by the photosensitive module or the motion attitude information output by the micromechanical inertial module, including:
in the in-situ rotation motion of the robot, a yaw angle is obtained in real time through the map area code or the motion attitude information;
calculating a difference value between the yaw angle and a target rotation angle contained in the motion control signal;
and when the difference value is smaller than a set threshold value, controlling the robot to stop the in-situ rotation motion.
8. The method of claim 1, further comprising:
in the motion state, the robot corrects the motion speed error in real time according to the encoder scale values of the configured encoder, so that the motion speed of the robot is consistent with the target motion speed set in the motion control signal.
9. A motion control device of a robot, wherein the motion of the robot is performed on a plurality of jigsaw puzzles which are spliced, at least one jigsaw is divided into a plurality of map areas, and the map areas contain map area codes, the robot is provided with a photosensitive module and a micro-mechanical inertia module, and the device comprises:
the measuring error correction module is used for controlling the robot to correct the measuring error of a micro-mechanical inertial element in the micro-mechanical inertial module according to the static attitude information output by the micro-mechanical inertial module and the map area code read by the photosensitive module in a static state;
the motion state switching module is used for controlling the robot to be switched from the static state to the motion state according to a triggered motion control signal, and the motion control signal indicates the robot to move through the designated map area;
and the motion attitude correction module is used for correcting the motion attitude of the robot in real time according to the map area code read by the photosensitive module or the motion attitude information output by the micro-mechanical inertia module, wherein the motion attitude information is obtained by the micro-mechanical inertia module performing attitude calculation on the measurement result output by the micro-mechanical inertia element.
10. A robot, comprising:
a processor; and
a memory for storing executable instructions of the processor;
wherein the processor is configured to perform the method of any of claims 1 to 8 via execution of the executable instructions.
11. A computer-readable storage medium, on which a computer program is stored, which, when being executed by a processor, carries out the method of any one of claims 1 to 8.
CN201910579972.3A 2019-06-28 2019-06-28 Robot motion control method and device, robot and storage medium Active CN112147995B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910579972.3A CN112147995B (en) 2019-06-28 2019-06-28 Robot motion control method and device, robot and storage medium

Publications (2)

Publication Number Publication Date
CN112147995A true CN112147995A (en) 2020-12-29
CN112147995B CN112147995B (en) 2024-02-27

Family

ID=73891609

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910579972.3A Active CN112147995B (en) 2019-06-28 2019-06-28 Robot motion control method and device, robot and storage medium

Country Status (1)

Country Link
CN (1) CN112147995B (en)

Patent Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20060293810A1 (en) * 2005-06-13 2006-12-28 Kabushiki Kaisha Toshiba Mobile robot and a method for calculating position and posture thereof
CN205049153U (en) * 2015-09-24 2016-02-24 北京理工大学 Sustainable navigation data collection system of vehicle under environment of photoelectric type GPS blind area
US20190005669A1 (en) * 2016-03-09 2019-01-03 Guangzhou Airob Robot Technology Co., Ltd. Method And Apparatus For Map Constructing And Map Correcting
CN106382934A (en) * 2016-11-16 2017-02-08 深圳普智联科机器人技术有限公司 High-precision moving robot positioning system and method
CN106780325A (en) * 2016-11-29 2017-05-31 维沃移动通信有限公司 A kind of picture joining method and mobile terminal
CN106647814A (en) * 2016-12-01 2017-05-10 华中科技大学 System and method of unmanned aerial vehicle visual sense assistant position and flight control based on two-dimensional landmark identification
CN107782305A (en) * 2017-09-22 2018-03-09 郑州郑大智能科技股份有限公司 A kind of method for positioning mobile robot based on digital alphabet identification
CN108592906A (en) * 2018-03-30 2018-09-28 合肥工业大学 AGV complex navigation methods based on Quick Response Code and inertial sensor
CN108592914A (en) * 2018-04-08 2018-09-28 河南科技学院 The positioning of complex region inspecting robot, navigation and time service method under no GPS scenario
CN109752003A (en) * 2018-12-26 2019-05-14 浙江大学 A kind of robot vision inertia dotted line characteristic positioning method and device

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
WANG Sidi et al.: "AGV fusion navigation system based on inertial navigation, RFID and image recognition", Hoisting and Conveying Machinery, no. 08, pages 81-84 *
LI Yongjian et al.: "Application of the MEMS inertial sensor ADIS16355 in attitude measurement", Journal of Data Acquisition and Processing, no. 04, pages 501-507 *

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113029200A (en) * 2021-03-29 2021-06-25 上海景吾智能科技有限公司 Method, system and medium for testing course angle and accuracy based on robot sensor
CN114571482A (en) * 2022-03-30 2022-06-03 长沙朗源电子科技有限公司 Drawing robot system and control method of drawing robot
CN114571482B (en) * 2022-03-30 2023-11-03 长沙朗源电子科技有限公司 Painting robot system and control method of painting robot
CN117086868A (en) * 2023-08-09 2023-11-21 北京小米机器人技术有限公司 Robot, control method and device thereof, and storage medium
CN117086868B (en) * 2023-08-09 2024-04-09 北京小米机器人技术有限公司 Robot, control method and device thereof, and storage medium

Also Published As

Publication number Publication date
CN112147995B (en) 2024-02-27

Similar Documents

Publication Publication Date Title
CN103033184B (en) Error correction method, device and system for inertial navigation system
CN108731673B (en) Autonomous navigation positioning method and system for robot
US5983166A (en) Structure measurement system
CN102598896B (en) Object control system, object control method and program, and rotational center position specification device
CN109387194B (en) Mobile robot positioning method and positioning system
US11441900B2 (en) Movable marking system, controlling method for movable marking apparatus, and computer readable recording medium
CN112147995B (en) Robot motion control method and device, robot and storage medium
JP2012003706A (en) Unmanned running vehicle guiding device and unmanned running vehicle guiding method
CN110850882A (en) Charging pile positioning method and device of sweeping robot
EP3438605B1 (en) Surveying device
KR100564236B1 (en) Self-localization apparatus and method of mobile robot
CN111174696A (en) Laser-assisted calibration method and device based on CCD sensor
TW202116144A (en) Method and module for displacement processing and mower device
CN116382315B (en) Picture construction method and system thereof, underwater robot, storage medium and electronic equipment
CN103542864B (en) A kind of inertial navigation fall into a trap step method and device
CN113960890A (en) Motion component control method in laser imaging equipment and related equipment
KR102683350B1 (en) Control system, control method and computer readable storage medium
CN112276934A (en) Control method, control device, tile paving system, storage medium and processor
CN114789439B (en) Slope positioning correction method, device, robot and readable storage medium
JP6734764B2 (en) Position estimation device, map information preparation device, moving body, position estimation method and program
CN114322918B (en) Method and device for detecting movable equipment state and computer readable storage medium
JP2021163277A (en) Position detection system
WO2020222790A1 (en) Positioning autonomous vehicles
US20240083036A1 (en) Method and apparatus for robot system management
CN108491905A (en) A kind of method for positioning mobile robot, system, medium and equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant