CN113712473A - Height calibration method and device and robot - Google Patents
- Publication number
- CN113712473A (application CN202110859722.2A)
- Authority
- CN
- China
- Prior art keywords
- height
- depth image
- robot
- ground
- acquisition module
- Prior art date
- Legal status
- Granted
Classifications
-
- A—HUMAN NECESSITIES
- A47—FURNITURE; DOMESTIC ARTICLES OR APPLIANCES; COFFEE MILLS; SPICE MILLS; SUCTION CLEANERS IN GENERAL
- A47L—DOMESTIC WASHING OR CLEANING; SUCTION CLEANERS IN GENERAL
- A47L11/00—Machines for cleaning floors, carpets, furniture, walls, or wall coverings
- A47L11/40—Parts or details of machines not provided for in groups A47L11/02 - A47L11/38, or not restricted to one of these groups, e.g. handles, arrangements of switches, skirts, buffers, levers
- A47L11/4061—Steering means; Means for avoiding obstacles; Details related to the place where the driver is accommodated
-
- A—HUMAN NECESSITIES
- A47—FURNITURE; DOMESTIC ARTICLES OR APPLIANCES; COFFEE MILLS; SPICE MILLS; SUCTION CLEANERS IN GENERAL
- A47L—DOMESTIC WASHING OR CLEANING; SUCTION CLEANERS IN GENERAL
- A47L2201/00—Robotic cleaning machines, i.e. with automatic control of the travelling movement or the cleaning operation
- A47L2201/04—Automatic control of the travelling movement; Automatic obstacle detection
Landscapes
- Length Measuring Devices By Optical Means (AREA)
- Optical Radar Systems And Details Thereof (AREA)
Abstract
The embodiments of this application disclose a height calibration method, a height calibration device, and a robot, applied in the technical field of robots. The method comprises the following steps: when the robot is placed horizontally on the ground, acquiring the height from the depth image acquisition module to the ground as detected by the depth image acquisition module; obtaining the height from the robot to the ground as a first height based on the detected height and a preset height difference; acquiring the height from the robot to the ground as detected by the infrared detection module as a second height; and calibrating the second height based on the first height. By using the first height, calculated from the measurement of the depth image acquisition module, to calibrate the second height, obtained directly by the infrared detection module, the embodiments of this application enable the robot to operate accurately and improve the accuracy of height measurement.
Description
Technical Field
The present disclosure relates to the field of robotics, and more particularly, to a height calibration method and apparatus, and a robot.
Background
With the development of smart homes, robot technology has advanced rapidly, and robots have become indispensable electronic products in people's daily lives. To avoid falling from a height, sweeping robots currently on the market continuously detect the distance from the robot to the ground during operation. At present, the main distance detection method uses an infrared diode to emit a signal and calculate the distance from the received reflection. However, objects of different colors and materials absorb and reflect infrared signals differently, which causes misjudgment during detection and, in serious cases, abnormal operation of the robot.
Disclosure of Invention
In view of the above problems, the present application provides a height calibration method, apparatus and robot to solve the above problems.
In a first aspect, an embodiment of the present application provides a height calibration method applied to a robot. The robot includes a depth image acquisition module and an infrared detection module; the infrared detection module is installed at the bottom of the robot, and the depth image acquisition module is installed on the robot with a preset height difference from the infrared detection module. The method includes: when the robot is placed horizontally on the ground, acquiring the height of the depth image acquisition module from the ground as detected by the depth image acquisition module; obtaining the height of the robot from the ground as a first height based on the detected height and the preset height difference; acquiring the height of the robot from the ground as detected by the infrared detection module as a second height; and calibrating the second height based on the first height.
In a second aspect, an embodiment of the present application provides a height calibration device applied to a robot. The robot includes a depth image acquisition module and an infrared detection module; the infrared detection module is installed at the bottom of the robot, and the depth image acquisition module is installed on the robot with a preset height difference from the infrared detection module. The device includes: a height acquisition module, configured to acquire, when the robot is placed horizontally on the ground, the height of the depth image acquisition module from the ground as detected by the depth image acquisition module; a first height acquisition module, configured to obtain the height of the robot from the ground as a first height based on the detected height and the preset height difference; a second height acquisition module, configured to acquire the height of the robot from the ground as detected by the infrared detection module as a second height; and a calibration module, configured to calibrate the second height based on the first height.
In a third aspect, embodiments of the present application provide a robot including a memory and a processor, the memory being coupled to the processor and storing instructions that, when executed by the processor, cause the processor to perform the above method.
According to the height calibration method, device, and robot provided herein, when the robot is placed horizontally on the ground, the height from the depth image acquisition module to the ground, as detected by that module, is obtained; the height from the robot to the ground is then obtained as a first height from the detected height and the preset height difference between the depth image acquisition module and the infrared detection module; the height from the robot to the ground, as detected by the infrared detection module, is obtained as a second height; and finally the second height is calibrated based on the first height, so that the robot can operate accurately and the accuracy of height measurement is improved.
Drawings
To illustrate the technical solutions in the embodiments of the present application more clearly, the drawings used in the description of the embodiments are briefly introduced below. Obviously, the drawings described below show only some embodiments of the present application, and those skilled in the art can obtain other drawings from them without creative effort.
FIG. 1 is a schematic diagram of an application environment suitable for the height calibration method provided by an embodiment of the present application;
FIG. 2 is a schematic flow chart of a height calibration method according to an embodiment of the present application;
FIG. 3 is a schematic flow chart of a height calibration method according to another embodiment of the present application;
FIG. 4 is a schematic flow chart of a height calibration method according to yet another embodiment of the present application;
FIG. 5 is a schematic flow chart of a height calibration method according to yet another embodiment of the present application;
FIG. 6 is a schematic flow chart of a height calibration method according to yet another embodiment of the present application;
FIG. 7 is a schematic flow chart of a height calibration method according to yet another embodiment of the present application;
FIG. 8 is a block diagram of a height calibration apparatus provided by an embodiment of the present application;
FIG. 9 is a block diagram of a robot for performing a height calibration method according to an embodiment of the present application;
FIG. 10 illustrates a storage unit for storing or carrying program code implementing a height calibration method according to an embodiment of the present application.
Detailed Description
To enable those skilled in the art to better understand the technical solutions, the technical solutions in the embodiments of the present application are described clearly and completely below with reference to the drawings in the embodiments of the present application.
With the development of smart homes, robot technology has advanced rapidly; robots are widely applied and have become indispensable electronic products in people's daily lives. Among household robots, the sweeping robot is used most frequently, and sweeping robots currently on the market continuously detect the distance from the robot to the ground during operation in order to avoid falling from a height. At present, the main distance detection method uses an infrared diode for transmission and reception. However, because objects of different colors and materials differ in how they absorb and reflect infrared signals, this method causes misjudgment during detection and, in serious cases, abnormal operation of the robot.
In view of the above technical problems, the inventors, through long-term research, propose the height calibration method, device, and robot of this application: a first height of the robot from the ground is calculated from the height obtained by the depth image acquisition module, and the first height is used to calibrate a second height of the robot from the ground obtained directly by the infrared detection module, so that the robot can operate accurately and the accuracy of height measurement is improved. The height calibration method is described in detail in the following embodiments.
An application scenario applicable to the height calibration method provided in the embodiment of the present application is described below.
Referring to fig. 1, fig. 1 is a schematic diagram of an application scenario for the height calibration method provided in the embodiment of the present application. The robot 100 includes a depth image acquisition module 130 and an infrared detection module 140. The infrared detection module 140 is installed at the bottom of the robot 100, and the depth image acquisition module 130 is installed on the robot 100 with a preset height difference from the infrared detection module 140. The depth image acquisition module 130 acquires a depth image to obtain the distance between the depth image acquisition module 130 and the ground, from which a first height is calculated using the preset height difference; the infrared detection module 140 acquires a second height; and finally the second height is calibrated based on the magnitude relationship between the first height and the second height.
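The overall flow in this scenario can be sketched in a few lines of Python. This is an illustrative sketch only: the helper names are hypothetical, the patent does not specify an implementation, and the sign of the mounting offset is an assumption.

```python
def first_height(module_to_ground, preset_diff):
    # Robot-to-ground height derived from the depth module's reading and the
    # mounting offset between the two modules (sign of the offset assumed).
    return module_to_ground - preset_diff

def calibrate(first, second):
    # Treat the depth-based first height as the accurate value and correct the
    # infrared second height with an additive offset.
    offset = first - second
    return second + offset

# e.g. depth module reads 120 mm, mounting offset 20 mm, infrared reads 85 mm
calibrated = calibrate(first_height(120, 20), 85)
```

After calibration, the corrected second height coincides with the depth-based reference, which is the goal of the scheme described above.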
Referring to fig. 2, fig. 2 is a flow chart of a height calibration method according to an embodiment of the present application. In this method, the first height of the robot from the ground is calculated from the height obtained by the depth image acquisition module, and the second height of the robot from the ground, obtained directly by the infrared detection module, is calibrated using the first height, so that the robot can operate accurately and the accuracy of height measurement is improved. In a specific embodiment, the height calibration method is applied to the height calibration apparatus 200 shown in fig. 8 and to the robot 100 (fig. 9) equipped with the height calibration apparatus 200. The following describes the specific flow of this embodiment taking a robot as an example; it should be understood that the robot in this embodiment may be any robot that moves on the ground, such as a sweeping robot, a transport robot, or a recognition robot, which is not limited herein. As shown in fig. 2, the height calibration method may specifically include the following steps:
step S110: when the robot is horizontally placed on the ground, the height of the depth image acquisition module from the ground, which is detected by the depth image acquisition module, is acquired.
In this embodiment, the depth image acquisition module may be a depth camera, where the depth camera includes, but is not limited to, a structured light depth camera, a binocular vision depth camera, and a time-of-flight depth camera.
In some embodiments, when the depth image acquisition module is a depth camera, the height of the depth image acquisition module from the ground is obtained by that depth camera. For example, when the depth camera is a structured light depth camera, the depth image is acquired using structured light; when the depth camera is a binocular vision depth camera, the depth image is acquired using binocular vision; and when the depth camera is a time-of-flight depth camera, the depth image is acquired using time of flight.
In other embodiments, when the depth image acquisition module is a structured light depth camera, the structured light depth camera includes, but is not limited to, a stripe structured light depth camera, a coded structured light depth camera, and a speckle structured light depth camera, which is not limited herein. For example, when the structured light depth camera is a stripe structured light depth camera, stripe structured light is used to capture the depth image; when it is a coded structured light depth camera, coded structured light is used to acquire the depth image; and when it is a speckle structured light depth camera, speckle structured light is used to acquire the depth image.
In still other embodiments, the depth image acquired by the depth image acquisition module may be a color image, a grayscale image, or a binary image, which is not limited herein.
As an embodiment, the depth image acquisition module obtains a depth image, decomposes the image into pixels, and obtains information from those pixels. The pixels include, but are not limited to, color pixels, grayscale pixels, and binary pixels; in each case, the information corresponding to that pixel type is obtained by decomposition. The information includes, but is not limited to, the volume of an object in the image, the shape of an object in the image, and the distance between the depth camera and the pixel.
In this embodiment, the depth image acquisition module acquires a depth image, the image is decomposed based on the acquired depth image, and the height of the depth image acquisition module from the horizontal ground is obtained from the pixels.
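As a concrete illustration of this step, the sketch below (hypothetical, not specified by the patent) treats the depth image as an array whose pixel values are camera-to-point distances in millimetres, and estimates the module-to-ground height as the median over a central patch, assuming the camera views level ground:

```python
import numpy as np

def module_height_from_depth(depth_image_mm):
    """Estimate module-to-ground height from a depth image whose pixels store
    camera-to-point distances in mm (an illustrative assumption)."""
    h, w = depth_image_mm.shape
    # Use the central quarter of the image to reduce edge distortion,
    # then take the median as a robust estimate of the ground distance.
    patch = depth_image_mm[h // 4: 3 * h // 4, w // 4: 3 * w // 4]
    return float(np.median(patch))
```

The median is used instead of the mean so that a few invalid or outlier pixels do not skew the height estimate.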
Step S120: and obtaining the height of the robot from the ground as a first height based on the height of the depth image acquisition module from the ground and the preset height difference.
In some embodiments, the robot presets and stores the preset height difference between the depth image acquisition module and the infrared detection module, where the infrared detection module is installed at the bottom of the robot. After the robot obtains the height from the depth image acquisition module to the horizontal ground, it calculates with that height and the preset height difference to obtain the height from the robot to the ground as the first height. It can therefore be understood that the preset height difference is a fixed value.
Step S130: and acquiring the height of the robot from the ground, which is detected by the infrared detection module, as a second height.
In some embodiments, the infrared detection module includes, but is not limited to, an infrared diode, a hand-held laser infrared distance meter, a telescope laser infrared distance meter, and the like, and the infrared detection module is not limited herein.
In other embodiments, the infrared detection module directly measures the reflected infrared signal it receives, thereby directly obtaining the height from the robot to the ground as the second height. For example, whether the infrared detection module is an infrared diode, a hand-held laser infrared distance meter, or a telescope-type laser infrared distance meter, it emits an infrared signal and receives the reflected signal to obtain the second height.
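For laser infrared distance meters, one common measuring principle (an assumption here; the patent does not fix a particular one) is pulse time of flight, where the distance is half the round-trip travel time multiplied by the speed of light:

```python
SPEED_OF_LIGHT_M_S = 299_792_458.0  # speed of light in vacuum, m/s

def tof_distance_m(round_trip_s):
    # d = c * t / 2: the pulse travels to the ground and back,
    # so half the round-trip path is the distance to the ground.
    return SPEED_OF_LIGHT_M_S * round_trip_s / 2.0
```

A 2 ns round trip, for instance, corresponds to roughly 0.3 m of distance, which indicates the timing precision such meters require.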
Step S140: calibrating the second height based on the first height.
In this embodiment, the first height is determined from parameters collected by the depth image acquisition module, which is little affected by the environment during collection; for example, it exhibits no detection differences caused by objects of different colors and materials. The accuracy of its collected parameters is therefore high, the first height determined from them is correspondingly reliable, and the first height can be regarded as an accurate value.
In this embodiment, the second height is determined from parameters collected by the infrared detection module, which is strongly affected by the environment during collection; for example, it may exhibit detection differences caused by objects of different colors and materials. The accuracy of its collected parameters is therefore low, the second height determined from them is correspondingly less reliable, and the second height can be regarded as a measured value.
In some embodiments, there are three possible cases before calibration: the first height is greater than the second height, less than the second height, or equal to the second height, which is not limited herein.
As another embodiment, after the first height and the second height are obtained, the first height may be regarded as an accurate value and the second height as a measured value, and the second height is calibrated based on the first height. The calibration method includes, but is not limited to, addition, subtraction, multiplication, or division applied to the second height based on the first height. For example, with addition or subtraction, a value is added to or subtracted from the second height so that the result approaches the first height; with multiplication or division, the second height is multiplied or divided by a value so that the result approaches the first height.
In some embodiments, calibrating the second height based on the first height may include: obtaining the magnitude relationship between the first height and the second height, determining a calibration parameter based on that relationship, and calculating the calibrated second height from the second height and the calibration parameter, where the difference between the calibrated second height and the first height is smaller than a preset threshold; that is, the calibrated second height is close to the first height.
In a first mode, when the first height is greater than the second height, that is, when the height of the robot from the ground derived from the depth image acquisition module is greater than the height of the robot from the ground detected by the infrared detection module, a first calibration parameter may be determined based on the magnitude relationship between the two heights, and the calibrated second height is obtained by adding the first calibration parameter to the second height. For example, assuming the first height is 100 and the second height is 50, the first calibration parameter may be 50, and adding it to the second height 50 gives the calibrated second height 100, which is close to (here, equal to) the first height 100.
In a second mode, when the first height is greater than the second height, a second calibration parameter may likewise be determined based on the magnitude relationship between the two heights, and the calibrated second height is obtained by multiplying the second height by the second calibration parameter. For example, assuming the first height is 100 and the second height is 50, the second calibration parameter may be 2, and multiplying the second height 50 by the second calibration parameter 2 gives the calibrated second height 100, which is close to (here, equal to) the first height 100.
In a third mode, when the first height is smaller than the second height, that is, when the height of the robot from the ground derived from the depth image acquisition module is smaller than the height of the robot from the ground detected by the infrared detection module, a third calibration parameter may be determined based on the magnitude relationship between the two heights, and the calibrated second height is obtained by subtracting the third calibration parameter from the second height. For example, assuming the first height is 50 and the second height is 100, the third calibration parameter may be 50, and subtracting it from the second height 100 gives the calibrated second height 50, which is close to (here, equal to) the first height 50.
In a fourth mode, when the first height is smaller than the second height, a fourth calibration parameter may likewise be determined based on the magnitude relationship between the two heights, and the calibrated second height is obtained by dividing the second height by the fourth calibration parameter. For example, assuming the first height is 50 and the second height is 100, the fourth calibration parameter may be 2, and dividing the second height 100 by the fourth calibration parameter 2 gives the calibrated second height 50, which is close to (here, equal to) the first height 50.
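The four modes reduce to two kinds of calibration parameter, additive and multiplicative. The sketch below (hypothetical helper names) reproduces the worked numbers from the description:

```python
def additive_parameter(first, second):
    # First/third modes: the offset to add to the second height
    # (negative when the first height is smaller, i.e. a subtraction).
    return first - second

def multiplicative_parameter(first, second):
    # Second/fourth modes: the factor to multiply the second height by
    # (below 1 when the first height is smaller, i.e. a division).
    return first / second

# First height 100, second height 50: offset 50, factor 2.
# First height 50, second height 100: offset -50, factor 0.5.
```

Either parameter maps the measured second height exactly onto the first height in these examples; in practice the parameter would be estimated once during calibration and reused for later measurements.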
In this embodiment, the robot calibrates the second height based on the first height according to the magnitude relationship between the first height and the second height.
According to the height calibration method provided by this embodiment, the robot acquires a depth image via the depth image acquisition module and obtains from it the height from the depth image acquisition module to the ground; the first height is then calculated from that height and the preset height difference; the second height from the robot to the ground is obtained directly by the infrared detection module; and the second height is calibrated based on the first height, improving the accuracy of height measurement.
Referring to fig. 3, fig. 3 is a schematic flow chart illustrating a height calibration method according to another embodiment of the present application. The method is applied to a robot, and will be described in detail with respect to the flow shown in fig. 3, and the height calibration method may specifically include the following steps:
step S210: and when the robot is horizontally placed on the ground, acquiring a depth image acquired by a depth image acquisition module.
In some embodiments, the depth image capturing module may be a color depth camera, a grayscale depth camera, or a binary depth camera, which is not limited herein. For example, when the depth image acquisition module is a color depth camera, the acquired depth image is a color depth image; when the depth image acquisition module is a gray level depth camera, the acquired depth image is a gray level depth image; when the depth image acquisition module is a binary depth camera, the acquired depth image is a binary depth image.
Step S220: and obtaining the height of the depth image acquisition module from the ground based on the depth image.
As an embodiment, after the depth image is acquired, it is subjected to image processing, which includes, but is not limited to, image segmentation, pixel extraction, and image sharpening. In this embodiment, pixels are extracted from the acquired depth image, and the height from the depth image acquisition module to the horizontal ground is obtained from those pixels.
In some embodiments, the depth image exists in multiple formats, such as a color depth image, a grayscale depth image, and a binary depth image, which is not limited herein. When the depth image is a color depth image, color pixels are extracted to obtain the height; when it is a grayscale depth image, grayscale pixels are extracted to obtain the height; and when it is a binary depth image, binary pixels are extracted to obtain the height between the depth image acquisition module and the ground.
Step S230: and obtaining the height of the robot from the ground as a first height based on the height of the depth image acquisition module from the ground and the preset height difference.
Step S240: and acquiring the height of the robot from the ground, which is detected by the infrared detection module, as a second height.
Step S250: calibrating the second height based on the first height.
For the detailed description of steps S230 to S250, refer to steps S120 to S140, which are not described herein again.
According to the height calibration method provided by this embodiment, the robot acquires a depth image via the depth image acquisition module, extracts the corresponding pixels from the depth image, and obtains the height from the depth image acquisition module to the horizontal ground from the pixel information; the first height is then calculated from that height and the preset height difference, the second height from the robot to the ground is obtained directly by the infrared detection module, and the second height is calibrated based on the first height. Measuring distance from image pixels improves both the accuracy and the speed of the measurement.
Referring to fig. 4, fig. 4 is a schematic flow chart illustrating a height calibration method according to still another embodiment of the present application. The method is applied to a robot, and will be described in detail with respect to the flow shown in fig. 4, and the height calibration method may specifically include the following steps:
step S310: and when the robot is horizontally placed on the ground, acquiring a depth image acquired by a depth image acquisition module.
For detailed description of step S310, please refer to step S210, which is not described herein again.
Step S320: and acquiring a connecting line between the optical center of the depth image acquisition module and the imaging center of the depth image, and acquiring an included angle between the connecting line and the ground.
In some embodiments, the depth image acquisition module may be a color depth camera, a grayscale depth camera, or a binary depth camera, or a structured light depth camera, a binocular vision depth camera, or a time-of-flight depth camera, which is not limited herein.
In one embodiment, when the depth image capture module is a depth camera, the location of the optical center of the lens of the depth camera is determined.
In other embodiments, the region used to determine the imaging center point of the depth image may be the entire depth image or a designated region within it. For example, when the region is the entire depth image, the imaging center point is the center of the depth image: if the depth image is rectangular, the center point is the intersection of the rectangle's diagonals; if circular, it is the center of the circle. Likewise, when the region is a designated region of the depth image: if the region is rectangular, the imaging center point is the intersection of its diagonals; if circular, it is the center of the circle; and if the region is an irregular polygon, the imaging center is determined from the specific arrangement of its diagonal connections.
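For the rectangular case, intersecting the two diagonals confirms that the imaging center point is simply the midpoint of the image; a minimal sketch (hypothetical function name):

```python
def imaging_center(width, height):
    """Intersection of the diagonals (0,0)-(w,h) and (w,0)-(0,h) of a
    width x height rectangle. Parametrising diagonal 1 as t*(w, h) and
    diagonal 2 as (w, 0) + s*(-w, h) and equating components gives
    t = s = 1/2, i.e. the midpoint."""
    return (width / 2.0, height / 2.0)
```

For a 640 x 480 depth image this places the imaging center at pixel coordinates (320, 240).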
As another embodiment, the optical center of the depth image acquisition module is connected to the imaging center of the depth image, and an included angle between the connection line and the ground is obtained.
Step S330: and acquiring the distance between the depth image acquisition module and the imaging center based on the depth image.
For the detailed description of step S330, please refer to step S220, which is not described herein again.
Step S340: and obtaining the height of the depth image acquisition module from the ground based on the distance and the included angle.
In this embodiment, after the distance L of the connecting line between the optical center of the depth image acquisition module and the imaging center of the depth image, and the included angle α between the connecting line and the horizontal ground, are obtained, the height of the robot from the ground is calculated as the first height based on the formula H0 = L·sin(α) + h.
Wherein H0 is the first height, L·sin(α) is the height of the depth image acquisition module from the ground, h is the preset height difference, L is the distance, and α is the included angle.
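Reading the variables together, the relation is H0 = L·sin(α) + h. A minimal sketch of this calculation (the function and parameter names are illustrative, not from this application):

```python
import math

def first_height(distance_l, angle_alpha_rad, preset_diff_h):
    """Compute H0 = L*sin(alpha) + h.

    L*sin(alpha) is the height of the depth image acquisition module
    from the ground; h is the preset height difference between the
    module and the infrared detection module at the robot's bottom.
    """
    return distance_l * math.sin(angle_alpha_rad) + preset_diff_h
```

For example, a connecting line of 1.0 m at a 30° angle with a 0.02 m preset height difference yields a first height of about 0.52 m.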
Step S350: and obtaining the height of the robot from the ground as a first height based on the height of the depth image acquisition module from the ground and the preset height difference.
Step S360: and acquiring the height of the robot from the ground, which is detected by the infrared detection module, as a second height.
Step S370: calibrating the second height based on the first height.
For detailed description of steps S350 to S370, refer to steps S120 to S140, which are not described herein again.
In the height calibration method provided in another embodiment of the present application, the robot acquires the depth image through the depth image acquisition module, acquires the connecting line between the optical center of the depth image acquisition module and the imaging center of the depth image, and acquires the included angle between the connecting line and the horizontal ground. Based on the depth image, the robot acquires the distance between the depth image acquisition module and the imaging center, and obtains the height of the depth image acquisition module from the ground from the distance and the included angle, so that the first height is calculated from the obtained height and the preset height difference. The second height from the robot to the ground is then directly obtained through the infrared detection module, and the second height is calibrated based on the first height. By obtaining the imaging center of the depth image and determining the connecting line and the included angle, the accuracy of the height measured by the depth image acquisition module is improved.
Referring to fig. 5, fig. 5 is a schematic flow chart illustrating a height calibration method according to yet another embodiment of the present application. The method is applied to a robot, and will be described in detail with respect to the flow shown in fig. 5, and the height calibration method may specifically include the following steps:
step S410: when the robot is horizontally placed on the ground, the height of the depth image acquisition module from the ground, which is detected by the depth image acquisition module, is acquired.
Step S420: and obtaining the height of the robot from the ground as a first height based on the height of the depth image acquisition module from the ground and the preset height difference.
Step S430: and acquiring the height of the robot from the ground, which is detected by the infrared detection module, as a second height.
For detailed description of steps S410 to S430, refer to steps S110 to S130, which are not described herein again.
Step S440: comparing the first height with the second height to obtain the size relation between the first height and the second height;
in some embodiments, there are three cases of the magnitude relationship between the first height and the second height: the first height is greater than the second height, the first height is less than the second height, or the first height and the second height are equal, which is not limited herein.
In one embodiment, when the first height is obtained, the corresponding second height is also obtained; the values of the first height and the second height are confirmed and compared to obtain the magnitude relationship between them.
Step S450: calibrating the second height based on the magnitude relationship.
As an embodiment, after obtaining a magnitude relationship between the first height and the second height, a calibration parameter may be determined based on the magnitude relationship, and the second height is calculated based on the calibration parameter to obtain a calibrated second height, where a height difference between the calibrated second height and the first height is smaller than a preset height difference. In some embodiments, calculating the second height based on the calibration parameter comprises: the second height is added with the calibration parameter, subtracted with the calibration parameter, multiplied by the calibration parameter, or divided by the calibration parameter, which is not limited herein.
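As one hedged illustration of determining a calibration parameter from the magnitude relationship and applying it, the additive case can be sketched as follows (function names are assumptions; the application equally allows subtraction, multiplication, or division):

```python
def determine_calibration_offset(first_height, second_height):
    """Additive calibration parameter derived from the magnitude relationship.

    first > second  -> positive offset (reading is too low, add)
    first < second  -> negative offset (reading is too high, subtract)
    first == second -> zero offset (no correction needed)
    """
    return first_height - second_height

def calibrate_second_height(second_height, offset):
    """Apply the calibration parameter to a raw infrared height reading."""
    return second_height + offset
```

With this choice of offset, the calibrated second height matches the first (reference) height, so the residual difference stays below any preset height-difference threshold.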
In the height calibration method provided by another embodiment of the present application, the robot acquires a depth image through the depth image acquisition module and obtains the height of the depth image acquisition module from the ground based on the depth image, so that the first height is calculated from the obtained height and the preset height difference. The second height from the robot to the ground is then directly acquired through the infrared detection module, the first height is compared with the second height to obtain the magnitude relationship between them, and the second height is calibrated based on that magnitude relationship, thereby improving the accuracy of height calibration.
Referring to fig. 6, fig. 6 is a schematic flow chart illustrating a height calibration method according to still another embodiment of the present application. The method is applied to a robot, and will be described in detail with respect to the flow shown in fig. 6, and the height calibration method may specifically include the following steps:
step S510: when the robot is horizontally placed on the ground, the height of the depth image acquisition module from the ground, which is detected by the depth image acquisition module, is acquired.
Step S520: and obtaining the height of the robot from the ground as a first height based on the height of the depth image acquisition module from the ground and the preset height difference.
Step S530: and acquiring the height of the robot from the ground, which is detected by the infrared detection module, as a second height.
Step S540: calibrating the second height based on the first height.
For the detailed description of steps S510 to S540, refer to steps S110 to S140, which are not described herein again.
Step S550: obtaining calibration parameters for calibrating the second height based on the first height.
In some embodiments, after the first height and the second height are obtained, a calibration parameter for calibrating the second height based on the first height may be obtained correspondingly. As one way, a calibration method may first be determined, and the calibration parameter for calibrating the second height is derived from that method. For example, when the calibration method is addition, the calibration parameter corresponds to adding a target value; when the calibration method is subtraction, to subtracting a target value; when the calibration method is multiplication, to multiplying by a target value; and when the calibration method is division, to dividing by a target value.
Step S560: and saving the calibration parameters.
In some embodiments, after the calibration parameters are obtained, they may be saved locally, so that in a later calibration process the calibration parameters can be obtained locally and used to calibrate the second height directly, improving calibration efficiency. For example, in a subsequent calibration process, the corresponding calibration parameters may be obtained locally according to the detected first height and second height.
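A minimal sketch of saving and reloading calibration parameters locally, assuming a JSON file as the local store (the storage format and function names are assumptions, not specified by this application):

```python
import json
import os

def save_calibration(path, params):
    """Persist calibration parameters locally so later calibrations can
    reuse them without re-deriving the first height."""
    with open(path, "w") as f:
        json.dump(params, f)

def load_calibration(path):
    """Load previously saved calibration parameters, or None if absent."""
    if not os.path.exists(path):
        return None
    with open(path) as f:
        return json.load(f)
```

On the next calibration pass, `load_calibration` returns the stored parameters directly, so the second height can be corrected without recomputing the depth-image-based first height.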
In the height calibration method provided by another embodiment of the present application, the robot acquires a depth image through the depth image acquisition module and obtains the height of the depth image acquisition module from the ground based on the depth image, thereby obtaining the first height; directly acquires the second height through the infrared detection module; compares the first height with the second height to obtain the magnitude relationship between them; obtains the calibration parameter based on that magnitude relationship; and finally stores the calibration parameter. Because the calibration parameter is stored, once the first height and the second height are obtained, the calibration parameter can be called locally and directly, which improves the speed and simplicity of height calibration.
Referring to fig. 7, fig. 7 is a schematic flow chart illustrating a height calibration method according to still another embodiment of the present application. The method is applied to a robot, and will be described in detail with respect to the flow shown in fig. 7, and the height calibration method may specifically include the following steps:
step S610: when the robot is horizontally placed on the ground, the height of the depth image acquisition module from the ground, which is detected by the depth image acquisition module, is acquired.
Step S620: and obtaining the height of the robot from the ground as a first height based on the height of the depth image acquisition module from the ground and the preset height difference.
Step S630: and acquiring the height of the robot from the ground, which is detected by the infrared detection module, as a second height.
Step S640: calibrating the second height based on the first height.
For the detailed description of steps S610 to S640, refer to steps S110 to S140, which are not described herein again.
Step S650: obtaining calibration parameters for calibrating the second height based on the first height.
For detailed description of step S650, please refer to step S550, which is not described herein.
Step S660: and calibrating the plurality of second infrared detection modules based on the calibration parameters.
In some embodiments, if there are a plurality of infrared detection modules, all of them may be height-calibrated.
In this embodiment, after the calibration parameter for calibrating the second height based on the first height is obtained, all of the plurality of second infrared detection modules are calibrated based on the determined calibration parameter. For example, when the calibration parameter is adding a determined value, the determined value is added to the heights acquired by the plurality of second infrared detection modules at the same time; when the calibration parameter is subtracting a determined value, the determined value is subtracted from those heights; when the calibration parameter is multiplying by a determined value, those heights are multiplied by the determined value; and when the calibration parameter is dividing by a determined value, those heights are divided by the determined value.
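Applying one calibration parameter to the readings of several second infrared detection modules at once can be sketched as follows (the function name and `method` strings are illustrative assumptions covering the four cases above):

```python
def calibrate_all(raw_heights, param, method="add"):
    """Apply a single calibration parameter to readings from several
    second infrared detection modules simultaneously."""
    ops = {
        "add": lambda h: h + param,
        "subtract": lambda h: h - param,
        "multiply": lambda h: h * param,
        "divide": lambda h: h / param,
    }
    return [ops[method](h) for h in raw_heights]
```

For example, with an additive parameter of 0.02 m, readings of 0.50 m and 0.48 m are corrected to approximately 0.52 m and 0.50 m.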
In yet another embodiment of the present application, the robot acquires a depth image through the depth image acquisition module and processes it to obtain the height of the depth image acquisition module from the ground, then calculates the first height based on the preset height difference; directly acquires the reflected infrared signal intensity through the infrared detection module to obtain the second height; compares the first height and the second height to obtain the magnitude relationship between them; obtains the calibration parameter corresponding to that relationship; and calibrates the plurality of second infrared detection modules based on the calibration parameter, thereby improving the accuracy of the heights acquired by the robot and ensuring that the robot can operate accurately.
Referring to fig. 8, fig. 8 is a block diagram illustrating a height calibration apparatus 200 according to an embodiment of the present disclosure. The height calibration apparatus is applied to the robot and will be explained with reference to the block diagram shown in fig. 8. The height calibration apparatus 200 includes: a height obtaining module 210, a first height obtaining module 220, a second height obtaining module 230, and a calibration module 240, wherein:
a height obtaining module 210, configured to obtain, when the robot is horizontally placed on the ground, a height of the depth image capturing module from the ground, which is detected by the depth image capturing module.
A first height obtaining module 220, configured to obtain, based on the height of the depth image acquisition module from the ground and the preset height difference, a height of the robot from the ground as a first height.
A second height obtaining module 230, configured to obtain a height of the robot from the ground, which is detected by the infrared detection module, as a second height.
A calibration module 240 configured to calibrate the second height based on the first height.
Optionally, the height obtaining module 210 includes a depth image acquisition sub-module and a height acquisition sub-module, wherein:
and the depth image acquisition sub-module is used for acquiring a depth image acquired by the depth image acquisition module when the robot is horizontally placed on the ground.
And the height acquisition submodule is used for acquiring the height of the depth image acquisition module from the ground based on the depth image.
Further, the height acquisition sub-module includes: an included angle acquisition unit, a distance acquisition unit, and a height acquisition unit, wherein:
and the included angle acquisition unit is used for acquiring a connecting line between the optical center of the depth image acquisition module and the imaging center of the depth image and acquiring an included angle between the connecting line and the ground.
And the distance acquisition unit is used for acquiring the distance between the depth image acquisition module and the imaging center based on the depth image.
And the height acquisition unit is used for acquiring the height of the depth image acquisition module from the ground based on the distance and the included angle.
Optionally, the calibration module 240 includes a magnitude relationship comparison sub-module and a calibration sub-module, wherein:
And the magnitude relationship comparison sub-module is used for comparing the first height with the second height to obtain the magnitude relationship between the first height and the second height.
A calibration sub-module for calibrating the second height based on the magnitude relationship.
Further, the calibration sub-module includes: a calibration parameter determination unit and a calibration unit, wherein:
and the calibration parameter determining unit is used for determining a calibration parameter based on the size relation.
And the calibration unit is used for calculating the second height based on the calibration parameter to obtain a calibrated second height, wherein the height difference between the calibrated second height and the first height is less than the preset height difference.
Further, the height calibration apparatus 200 further includes: a calibration parameter acquisition module and a calibration parameter storage module, wherein:
and the calibration parameter acquisition module is used for acquiring a calibration parameter for calibrating the second height based on the first height.
And the calibration parameter storage module is used for storing the calibration parameters.
Further, the number of the infrared detection modules is plural; the plurality of infrared detection modules include a first infrared detection module and a plurality of second infrared detection modules, and the second height is obtained through detection by the first infrared detection module. The height calibration apparatus 200 further includes: a calibration parameter acquisition module and a calibration module, wherein:
and the calibration parameter acquisition module is used for acquiring a calibration parameter for calibrating the second height based on the first height.
And the calibration module is used for calibrating the plurality of second infrared detection modules based on the calibration parameters.
Referring to fig. 9, a block diagram of a robot 100 according to an embodiment of the present disclosure is shown. The robot 100 may be a transfer robot, a recognition robot, or the like that can move on the floor. The robot 100 in the present application may include one or more of the following components: a processor 110, a memory 120, a depth image acquisition module 130, an infrared detection module 140, and one or more applications, wherein the one or more applications may be stored in the memory 120 and configured to be executed by the one or more processors 110, the one or more programs configured to perform the methods as described in the foregoing method embodiments.
The Memory 120 may include a Random Access Memory (RAM) or a Read-Only Memory (ROM). The memory 120 may be used to store instructions, programs, code sets, or instruction sets. The memory 120 may include a stored program area and a stored data area, wherein the stored program area may store instructions for implementing an operating system, instructions for implementing at least one function (such as a touch function, a sound playing function, an image playing function, etc.), instructions for implementing the various method embodiments described above, and the like. The stored data area may also store data created by the robot 100 during use (e.g., phone book, audio-video data, chat log data), and the like.
The depth image acquisition module 130 is configured to acquire a depth image and obtain a distance based on the depth image.
The infrared detection module 140 is used for transmitting and receiving infrared signals, so as to calculate the distance.
Referring to fig. 10, a block diagram of a computer-readable storage medium according to an embodiment of the present application is shown. The computer-readable medium 300 has stored therein a program code that can be called by a processor to execute the method described in the above-described method embodiments.
The computer-readable storage medium 300 may be an electronic memory such as a flash memory, an EEPROM (electrically erasable and programmable read only memory), an EPROM, a hard disk, or a ROM. Alternatively, the computer-readable storage medium 300 includes a non-volatile computer-readable storage medium. The computer readable storage medium 300 has storage space for program code 310 to perform any of the method steps of the method described above. The program code can be read from or written to one or more computer program products. The program code 310 may be compressed, for example, in a suitable form.
To sum up, according to the height calibration method, the height calibration device and the robot provided by the embodiments of the present application, when the robot is horizontally placed on the ground, the height of the depth image acquisition module from the ground detected by the depth image acquisition module is obtained; the height of the robot from the ground is obtained as a first height based on that height and the preset height difference between the depth image acquisition module and the infrared detection module; the height of the robot from the ground detected by the infrared detection module is obtained as a second height; and finally the second height is calibrated based on the first height, so that the robot can operate accurately and the accuracy of height measurement is improved.
Finally, it should be noted that: the above embodiments are only used to illustrate the technical solutions of the present application, and not to limit them; although the present application has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art will understand that the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced, and such modifications and substitutions do not cause the essence of the corresponding technical solutions to depart from the scope of the technical solutions of the embodiments of the present application.
Claims (10)
1. A height calibration method is applied to a robot, the robot comprises a depth image acquisition module and an infrared detection module, the infrared detection module is installed at the bottom of the robot, the depth image acquisition module is installed on the robot and has a preset height difference with the infrared detection module, and the method comprises the following steps:
when the robot is horizontally placed on the ground, the height of the depth image acquisition module from the ground, which is detected by the depth image acquisition module, is acquired;
obtaining the height of the robot from the ground as a first height based on the height of the depth image acquisition module from the ground and the preset height difference;
acquiring the height of the robot from the ground, which is detected by the infrared detection module, as a second height;
calibrating the second height based on the first height.
2. The method of claim 1, wherein the obtaining the height of the depth image acquisition module from the ground detected by the depth image acquisition module when the robot is horizontally placed on the ground comprises:
when the robot is horizontally placed on the ground, acquiring a depth image acquired by a depth image acquisition module;
and obtaining the height of the depth image acquisition module from the ground based on the depth image.
3. The method of claim 2, wherein obtaining the height of the depth image acquisition module from the ground based on the depth image comprises:
acquiring a connecting line between the optical center of the depth image acquisition module and the imaging center of the depth image, and acquiring an included angle between the connecting line and the ground;
based on the depth image, acquiring the distance between a depth image acquisition module and the imaging center;
and obtaining the height of the depth image acquisition module from the ground based on the distance and the included angle.
4. The method of claim 1, wherein the obtaining the height of the robot from the ground as a first height based on the height of the depth image acquisition module from the ground and the preset height difference comprises:
calculating the height of the robot from the ground as the first height based on the formula H0 = L·sin(α) + h, wherein H0 is the first height, L·sin(α) is the height of the depth image acquisition module from the ground, h is the preset height difference, L is the distance, and α is the included angle.
5. The method of any one of claims 1-4, wherein calibrating the second height based on the first height comprises:
comparing the first height with the second height to obtain the magnitude relationship between the first height and the second height;
calibrating the second height based on the magnitude relationship.
6. The method of claim 5, wherein the calibrating the second height based on the magnitude relationship comprises:
determining a calibration parameter based on the magnitude relationship;
and calculating the second height based on the calibration parameters to obtain a calibrated second height, wherein the height difference between the calibrated second height and the first height is less than a preset height difference.
7. The method of any one of claims 1-4, after calibrating the second height based on the first height, comprising:
obtaining a calibration parameter that calibrates the second height based on the first height;
and saving the calibration parameters.
8. The method according to any one of claims 1 to 4, wherein the number of the infrared detection modules is plural, the plural infrared detection modules include a first infrared detection module and plural second infrared detection modules, the second height is obtained by detection of the first infrared detection module, and after calibrating the second height based on the first height, the method further includes:
obtaining a calibration parameter that calibrates the second height based on the first height;
and calibrating the plurality of second infrared detection modules based on the calibration parameters.
9. A height calibration apparatus, applied to a robot, wherein the robot comprises a depth image acquisition module and an infrared detection module, the infrared detection module is installed at the bottom of the robot, the depth image acquisition module is installed on the robot and has a preset height difference from the infrared detection module, and the apparatus comprises:
the height acquisition module is used for acquiring the height of the depth image acquisition module from the ground, which is detected by the depth image acquisition module, when the robot is horizontally placed on the ground;
the first height acquisition module is used for acquiring the height of the robot from the ground as a first height based on the height of the depth image acquisition module from the ground and the preset height difference;
the second height acquisition module is used for acquiring the height of the robot from the ground, which is detected by the infrared detection module, as a second height;
a calibration module to calibrate the second height based on the first height.
10. A robot, comprising a memory and a processor, the memory being coupled to the processor and storing instructions that, when executed by the processor, cause the processor to perform the method of any one of claims 1-8.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110859722.2A CN113712473B (en) | 2021-07-28 | 2021-07-28 | Height calibration method and device and robot |
Publications (2)
Publication Number | Publication Date |
---|---|
CN113712473A true CN113712473A (en) | 2021-11-30 |
CN113712473B CN113712473B (en) | 2022-09-27 |
Family
ID=78674253
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202110859722.2A Active CN113712473B (en) | 2021-07-28 | 2021-07-28 | Height calibration method and device and robot |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN113712473B (en) |
Citations (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2016081373A (en) * | 2014-10-20 | 2016-05-16 | シャープ株式会社 | Autonomous type electric device with floor face detection sensor, and sensitivity adjustment method of floor face detection sensor |
CN106020201A (en) * | 2016-07-13 | 2016-10-12 | 广东奥讯智能设备技术有限公司 | Mobile robot 3D navigation and positioning system and navigation and positioning method |
CN107028559A (en) * | 2017-04-25 | 2017-08-11 | 湖南格兰博智能科技有限责任公司 | A kind of sweeper and its anti-fall method |
CN109049005A (en) * | 2018-07-30 | 2018-12-21 | 苏州穿山甲机器人股份有限公司 | The installation bearing calibration of dropproof infrared sensor |
CN110852312A (en) * | 2020-01-14 | 2020-02-28 | 深圳飞科机器人有限公司 | Cliff detection method, mobile robot control method, and mobile robot |
JP2020052601A (en) * | 2018-09-26 | 2020-04-02 | パナソニックIpマネジメント株式会社 | Autonomous travel cleaner and control method |
CN112022023A (en) * | 2020-07-16 | 2020-12-04 | 湖南格兰博智能科技有限责任公司 | Adaptive calibration learning method for ground sensor of sweeper |
CN112051844A (en) * | 2020-08-17 | 2020-12-08 | 尚科宁家(中国)科技有限公司 | Self-moving robot and control method thereof |
CN112826393A (en) * | 2020-12-30 | 2021-05-25 | 北京奇虎科技有限公司 | Sweeping robot operation management method, sweeping robot, equipment and storage medium |
CN113146683A (en) * | 2021-03-18 | 2021-07-23 | 深兰科技(上海)有限公司 | Robot chassis and robot |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication ||
SE01 | Entry into force of request for substantive examination ||
GR01 | Patent grant ||