CN110916562A - Autonomous mobile device, control method, and storage medium - Google Patents

Autonomous mobile device, control method, and storage medium

Info

Publication number: CN110916562A
Authority: CN (China)
Prior art keywords: area array laser sensor; autonomous mobile device
Legal status: Pending (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Application number: CN201811089507.3A
Other languages: Chinese (zh)
Inventor: 岑斌
Current assignee: Ecovacs Robotics Suzhou Co Ltd (the listed assignees may be inaccurate; Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list)
Original assignee: Ecovacs Robotics Suzhou Co Ltd
Application filed by Ecovacs Robotics Suzhou Co Ltd
Priority filings: CN201811089507.3A; PCT/CN2019/095962 (WO2020038155A1); EP19852597.4A (EP3842885A4); US16/542,218 (US20200064481A1)
Publication of CN110916562A (pending legal status)

Classifications

    • A47L11/24: Floor-sweeping machines, motor-driven
    • A47L1/02: Cleaning windows; power-driven machines or devices
    • A47L11/40: Parts or details of floor-cleaning machines not provided for in groups A47L11/02-A47L11/38, e.g. handles, arrangements of switches, skirts, buffers, levers
    • A47L11/4011: Regulation of the cleaning machine by electric means; control systems and remote control systems therefor
    • B25J5/007: Manipulators mounted on wheels
    • B25J9/1697: Programme controls; vision controlled systems
    • B25J19/022: Optical sensing devices using lasers
    • G05D1/0231: Control of position or course in two dimensions, specially adapted to land vehicles, using optical position detecting means
    • G05D1/0255: Control of position or course in two dimensions, specially adapted to land vehicles, using acoustic signals, e.g. ultrasonic signals
    • A47L2201/00: Robotic cleaning machines, i.e. with automatic control of the travelling movement or the cleaning operation
    • A47L2201/04: Automatic control of the travelling movement; automatic obstacle detection

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Mechanical Engineering (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Robotics (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Acoustics & Sound (AREA)
  • Electromagnetism (AREA)
  • Optics & Photonics (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

An embodiment of the application provides an autonomous mobile device, a control method, and a storage medium. In the embodiments of the application, the autonomous mobile device senses its environment based on the environmental information collected by an area array laser sensor and uses it to carry out its various functions. The environmental information collected by the area array laser sensor contains high-precision, high-resolution direction and distance information as well as reflectivity information, from which environmental features of matching and recognition value can be extracted, giving the device strong environment-recognition capability and improving its spatial understanding of the environment. Compared with a sensing scheme based on an image sensor, this provides more accurate distance and direction information, reduces the computational complexity of sensing, and improves real-time performance.

Description

Autonomous mobile device, control method, and storage medium
Technical Field
The present application relates to the field of artificial intelligence technologies, and in particular, to an autonomous mobile device, a control method, and a storage medium.
Background
With the development of artificial intelligence technology, research on autonomous mobile devices such as robots has gradually deepened. The ability to sense and interact with the external environment is the basis on which an autonomous mobile device can move autonomously and execute tasks. At present, most autonomous mobile devices sense the external environment through sensors such as single-line lidar, multi-line lidar, and image sensors, meeting autonomy requirements such as obstacle recognition and positioning.
However, the existing schemes that sense the external environment with sensors such as single-line lidar, multi-line lidar, and image sensors all have certain shortcomings. For example, sensing schemes based on image sensors have high computational complexity and poor real-time performance, while sensing schemes based on single-line or multi-line lidar are limited in spatial comprehension. A new sensing scheme for autonomous mobile devices is therefore needed.
Disclosure of Invention
Aspects of the present disclosure provide an autonomous mobile device, a control method, and a storage medium, which improve the autonomous mobile device's spatial understanding of its environment, reduce computational complexity, and improve real-time performance.
An embodiment of the present application provides an autonomous mobile device, including a device body on which a control unit and an area array laser sensor are disposed, the control unit being electrically connected with the area array laser sensor. The area array laser sensor is configured to collect environmental information in the working environment of the autonomous mobile device and transmit it to the control unit; the control unit is configured to perform function control on the autonomous mobile device according to the environmental information.
An embodiment of the present application further provides a control method for an autonomous mobile device, including: controlling an area array laser sensor on the autonomous mobile device to collect environmental information in the device's working environment; acquiring the environmental information output by the area array laser sensor; and performing function control on the autonomous mobile device according to the environmental information.
Embodiments of the present application also provide a computer-readable storage medium storing a computer program that, when executed by one or more processors, causes the one or more processors to: control an area array laser sensor on an autonomous mobile device to collect environmental information in the device's working environment; acquire the environmental information output by the area array laser sensor; and perform function control on the autonomous mobile device according to the environmental information.
In the embodiments of the application, the autonomous mobile device senses its environment based on the environmental information collected by the area array laser sensor and uses it to carry out its various functions. The environmental information collected by the area array laser sensor contains high-precision, high-resolution direction and distance information as well as reflectivity information, from which environmental features of matching and recognition value can be extracted, giving the device strong environment-recognition capability and improving its spatial understanding of the environment. Compared with a sensing scheme based on an image sensor, this provides more accurate distance and direction information, reduces the computational complexity of sensing, and improves real-time performance.
Drawings
The accompanying drawings, which are included to provide a further understanding of the application and are incorporated in and constitute a part of this application, illustrate embodiment(s) of the application and together with the description serve to explain the application and not to limit the application. In the drawings:
fig. 1a is a schematic diagram of a hardware structure of an autonomous mobile device according to an exemplary embodiment of the present application;
fig. 1b is a schematic diagram of a hardware structure of another autonomous mobile device provided in an exemplary embodiment of the present application;
figs. 2a-2d are schematic diagrams of the external contours of several sweeping robots provided by exemplary embodiments of the present application;
fig. 3 is a schematic height diagram of an unmanned vehicle provided in an exemplary embodiment of the present application;
fig. 4a is a schematic diagram illustrating a state where an area array laser sensor is disposed on an autonomous mobile device with a circular outer contour according to an exemplary embodiment of the present application;
figs. 4b-4d are a front view, a top view, and a side view, respectively, of a cylindrical autonomous mobile device provided with an area array laser sensor according to an exemplary embodiment of the present disclosure;
figs. 4e-4g are schematic diagrams of an area array laser sensor disposed at the middle, top, or bottom position of an autonomous mobile device with a square outer contour according to an exemplary embodiment of the present application;
fig. 5a and 5b are a front view and a top view of the autonomous mobile device with a circular outer contour in a state where the farthest visual distance ends of the horizontal visual angles of the adjacent area array laser sensors intersect;
fig. 5c and 5d are a front view and a top view of the autonomous moving apparatus with a circular outer contour in a state where horizontal view angles of adjacent area array laser sensors are partially overlapped;
fig. 5e and 5f are a front view and a top view of the autonomous moving apparatus with a circular outer contour in a state where the boundaries of the horizontal viewing angles of the adjacent area array laser sensors are parallel, respectively;
fig. 6a and 6b are a front view and a top view of the autonomous mobile device with a triangular outer contour in a state where the farthest visual distance ends of the horizontal visual angles of the adjacent area array laser sensors intersect;
fig. 6c and 6d are a front view and a top view of the autonomous moving apparatus with a triangular outer contour in a state where horizontal viewing angles of adjacent area array laser sensors are partially overlapped;
fig. 6e and 6f are a front view and a top view of the autonomous moving apparatus with a triangular outer contour and a parallel boundary of horizontal viewing angles of adjacent area array laser sensors;
fig. 7 is a flowchart illustrating a control method for an autonomous mobile device according to an exemplary embodiment of the present application;
fig. 8 is an illustration of a 3D grid map constructed from environmental information collected by an area array laser sensor according to an exemplary embodiment of the present application.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the technical solutions of the present application will be described in detail and completely with reference to the following specific embodiments of the present application and the accompanying drawings. It should be apparent that the described embodiments are only some of the embodiments of the present application, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
Aiming at the shortcomings of existing autonomous mobile devices in sensing the external environment, the embodiments of the application provide a solution: an area array laser sensor is applied to the autonomous mobile device, and the device performs environment sensing based on the environmental information collected by that sensor in order to carry out its various functions. The environmental information collected by the area array laser sensor contains high-precision, high-resolution direction and distance information as well as reflectivity information, from which environmental features of matching and recognition value can be extracted; the environment-recognition capability is strong, and the spatial understanding of the autonomous mobile device of its environment is improved. Compared with a sensing scheme based on an image sensor, more accurate distance and direction information is provided, the complexity of the sensing computation is reduced, and real-time performance is improved.
The technical solutions provided by the embodiments of the present application are described in detail below with reference to the accompanying drawings.
Fig. 1a is a schematic diagram of a hardware structure of an autonomous mobile device according to an exemplary embodiment of the present application. As shown in fig. 1a, the autonomous mobile device includes: the device comprises a device body 101, wherein a control unit 102 and an area array laser sensor 103 are arranged on the device body 101, and the control unit 102 is electrically connected with the area array laser sensor 103.
The autonomous mobile device of this embodiment may be any mechanical device capable of moving through its working environment with a high degree of autonomy, for example an unmanned vehicle, an unmanned aerial vehicle, or a robot. The robot may be a cleaning robot or another service robot. A cleaning robot is a robot capable of autonomously performing cleaning tasks in its working environment, and includes floor-sweeping robots, glass-cleaning robots, and the like. Other service robots are robots that move autonomously in their working environment to provide non-cleaning services, and include air purification robots, home companion robots, welcome robots, and the like.
Of course, the shape of the autonomous mobile device may vary with its implementation. This embodiment does not limit the form of the autonomous mobile device. Taking the outer contour as an example, it may be an irregular shape or one of several regular shapes, for example a circle, an ellipse, a square, a triangle, a drop shape, or a D shape. Shapes other than these regular ones count as irregular, such as the outer contour of a humanoid robot, the outer contour of an unmanned vehicle, or the outer contour of an unmanned aerial vehicle.
Alternatively, taking the autonomous moving apparatus as a sweeping robot for example, as shown in fig. 2a-2d, the outer contour of the sweeping robot may be circular, oval, square, or triangular, etc.
In order for an autonomous mobile device to be able to autonomously move in its working environment, it is necessary to sense its working environment. In this embodiment, the autonomous mobile device is provided with an area array laser sensor 103, and the area array laser sensor 103 collects environmental information in the working environment of the autonomous mobile device and transmits the collected environmental information to the control unit 102; the control unit 102 receives the environmental information transmitted by the area array laser sensor 103, and senses the operation environment of the autonomous mobile device according to the environmental information to perform function control on the autonomous mobile device.
The area array laser sensor 103 mainly includes a laser emitting array and an information collection module; the information collection module can capture an environment image and receive the light reflected back when the laser strikes an object, and may include components such as a camera. The working principle of the area array laser sensor 103 is as follows: the laser emitting array emits light outward through the optical imaging system in front of it; when the emitted light reaches an object surface, part of it is reflected back and forms pixel points on an image through the optical imaging system in front of the information collection module. Because different points on the object surface lie at different distances, the time of flight (TOF) of the reflected light differs; by measuring the time of flight, each pixel obtains independent distance information, and the detection range can exceed one hundred meters. In addition, the information collection module of the area array laser sensor 103 can capture images of the surrounding environment, achieving fast 3D imaging at megapixel resolution with an imaging rate above 30 frames per second.
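As an illustration of the per-pixel ranging just described, each pixel's distance follows from d = c * t / 2, where t is the measured round-trip time of the reflected light. The following is a minimal sketch; the function name and the sample numbers are illustrative, not the sensor's actual interface:

```python
import numpy as np

C = 299_792_458.0  # speed of light in m/s

def tof_frame_to_depth(round_trip_times_s):
    """Convert a frame of per-pixel round-trip times (seconds) into
    per-pixel distances (meters). The light travels out to the object
    surface and back, so the one-way distance is half the path."""
    return C * np.asarray(round_trip_times_s) / 2.0

# A reflection arriving 200 ns after emission corresponds to a surface
# roughly 30 m away; a full frame of times yields a depth map at once.
print(tof_frame_to_depth([200e-9, 667e-9]))  # ~[30.0, 100.0]
```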
The environmental information collected by the area array laser sensor 103 contains not only direction and distance information but also reflectivity information of the object surface, and, aided by deep learning techniques in three-dimensional scenes, enables recognition of environmental elements. When the laser lines are numerous and dense, the data formed by the reflectivity information can be regarded as texture information, from which environmental features with matching and recognition value can be extracted; the environment-recognition capability is strong, and the advantages of visual algorithms operating on texture information can be enjoyed to a certain extent. The area array laser sensor 103 therefore combines the strengths of line laser sensors and vision sensors well: it improves the spatial understanding of the autonomous mobile device of its environment and qualitatively improves its obstacle-recognition performance, potentially bringing the device's spatial understanding of the environment close to the level of human eyes. In addition, compared with a sensing scheme based on an image sensor, the area array laser sensor 103 provides more accurate distance and direction information, reduces the complexity of the sensing computation, and improves real-time performance.
Beyond the above, the area array laser sensor 103 has the following advantages: 1) it is solid-state, low-cost, and compact; 2) it requires no rotating parts when installed and used, so the structure and size of the sensor can be greatly reduced, the service life extended, and the cost lowered; 3) its viewing angle can be adjusted and adapted to different autonomous mobile devices, which speeds up scanning and improves scanning precision; 4) it can collect environmental information in the horizontal and vertical directions simultaneously, allowing a 3D map to be built, which helps improve the accuracy of map-based functions such as positioning and navigation planning.
It should be noted that, based on the environmental information collected by the area array laser sensor 103, which covers the three dimensions of direction, distance, and reflectivity, the autonomous mobile device can be controlled to implement various functions built on environment perception. For example, the object recognition, tracking, and classification functions of visual algorithms can be realized; in addition, thanks to the high precision of laser ranging, positioning and map building with strong real-time performance, strong robustness, and high precision can be achieved, and the resulting high-precision environment map provides comprehensive support for motion planning, path navigation, and the like. Detailed descriptions of these functions can be found in the method embodiments below and are not repeated here.
In some alternative embodiments, as shown in fig. 1b, a pre-processing chip 104 is disposed on the device body 101, and the pre-processing chip 104 is electrically connected to the control unit 102 and the area array laser sensor 103, respectively. The preprocessing chip 104 is mainly used for preprocessing the environmental information collected by the area array laser sensor 103 and outputting the preprocessed environmental information to the control unit 102 before the control unit 102 uses the environmental information collected by the area array laser sensor 103. Accordingly, the control unit 102 may specifically perform function control on the autonomous mobile device according to the preprocessed environment information provided by the preprocessing chip 104.
Optionally, the preprocessing performed by the preprocessing chip 104 includes one or any combination of parsing, synchronization, fusion, and filtering. Parsing converts the format of the environmental information collected by the area array laser sensor 103 from the data format supported by the sensor into the data format supported by the control unit 102. Synchronization aligns in time the environmental information collected by multiple area array laser sensors 103 when those sensors are triggered asynchronously. Fusion integrates the environmental information collected by the individual sensors when multiple area array laser sensors 103 are present. Filtering applies smoothing, denoising, edge enhancement, feature extraction, removal of specific frequencies, and similar processing to the environmental information collected by the area array laser sensor 103, and mainly covers the detection and removal of outliers.
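To make the synchronization, fusion, and filtering steps concrete, the sketch below illustrates them over point-cloud frames. It is a minimal illustration under assumed data layouts, frames as (N, 4) arrays of x, y, z, reflectivity with a timestamp, plus 4x4 sensor-to-body extrinsic transforms; it is not the preprocessing chip's actual interface:

```python
from dataclasses import dataclass
import numpy as np

@dataclass
class Frame:
    timestamp: float     # acquisition time in seconds
    points: np.ndarray   # shape (N, 4): x, y, z, reflectivity

def synchronize(streams, tolerance=0.02):
    """Align asynchronously triggered sensors in time: for the newest frame
    of the first stream, pick the closest frame of every other stream and
    keep it if the timestamps agree within `tolerance` seconds."""
    ref = streams[0][-1]
    group = [ref]
    for stream in streams[1:]:
        nearest = min(stream, key=lambda f: abs(f.timestamp - ref.timestamp))
        if abs(nearest.timestamp - ref.timestamp) <= tolerance:
            group.append(nearest)
    return group

def fuse(group, extrinsics):
    """Integrate the clouds of one synchronized group into a single cloud
    in the device body frame; extrinsics[i] is sensor i's 4x4 transform."""
    clouds = []
    for frame, T in zip(group, extrinsics):
        xyz1 = np.c_[frame.points[:, :3], np.ones(len(frame.points))]
        body = (T @ xyz1.T).T[:, :3]
        clouds.append(np.c_[body, frame.points[:, 3:]])
    return np.vstack(clouds)

def filter_outliers(points, max_range=100.0):
    """Drop zero returns and returns beyond the stated ~100 m range."""
    r = np.linalg.norm(points[:, :3], axis=1)
    return points[(r > 0.0) & (r <= max_range)]
```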
In these alternative embodiments, the preprocessing chip 104 preprocesses the environment information, so that high-quality environment information can be provided to the control unit 102, which is beneficial to reducing the processing load of the control unit 102, improving the information processing efficiency, and further improving the efficiency of function control on the autonomous mobile device.
Further, as shown in fig. 1b, the preprocessing chip 104 and the area array laser sensor 103 are integrated into one module; for convenience of description and distinction, this module is referred to as the environment sensing module 105. That is, the preprocessing chip 104 and the area array laser sensor 103 are integrated in the environment sensing module 105, which is mainly used to collect environmental information in the working environment of the autonomous mobile device, preprocess it, and output it to the control unit 102.
Optionally, the environment sensing module 105 may use the area array laser sensor 103 as its main sensor, responsible for collecting environmental information in the working environment of the autonomous mobile device. In addition, the environment sensing module 105 may integrate a non-area-array laser sensor 106 as an auxiliary sensor to help collect richer environmental information. Optionally, the non-area-array laser sensor 106 may include one or any combination of an ultrasonic sensor, an infrared sensor, a vision sensor, a single-line laser sensor, and a multi-line laser sensor. Combining the environmental information collected by the various sensors can further improve the accuracy and precision of environment perception, and thereby the accuracy of function control.
Optionally, the preprocessing chip 104 is further electrically connected to the non-area-array laser sensor 106, and performs at least one of parsing, synchronizing, fusing, filtering, and the like on the environmental information acquired by the non-area-array laser sensor 106. The "fusion" herein includes the fusion of environmental information from the area array laser sensor 103 and the non-area array laser sensor 106; accordingly, "synchronizing" herein includes synchronizing environmental information from the area array laser sensor 103 and the non-area array laser sensor 106.
In the above embodiment, the area array laser sensor 103, the non-area array laser sensor 106, and the preprocessing chip 104 are designed in a modular manner, so that the system is convenient to use in various autonomous mobile devices, is also beneficial to reducing the volume of the autonomous mobile devices, and is more convenient in aspects of reuse, upgrade, maintenance, and the like.
It should be noted that, according to the implementation form of the autonomous mobile device, the form, structure, number, view angle range and installation position of the area array laser sensor may be different. The embodiment of the application does not limit the form, structure, number, view angle range and setting position of the area array laser sensor, and can be combined with the realization form of the autonomous mobile equipment and the adaptive selection and setting of application requirements. The following exemplary description is made around an area array laser sensor in several respects:
coverage of the area array laser sensor:
the area array laser sensor has a certain viewing angle in both horizontal and vertical directions, which are referred to as horizontal viewing angle and vertical viewing angle, as shown in fig. 4 a. In fig. 4a, the outer contour of the autonomous mobile apparatus is circular, but is not limited thereto. The horizontal visual angle refers to an effective range of the area array laser sensor on a horizontal plane, and the vertical visual angle refers to an effective range of the area array laser sensor on a vertical plane. The horizontal visual angle and the vertical visual angle of the area array laser sensor are combined to form an effective space range in which the area array laser sensor can collect information. The horizontal visual angle and the vertical visual angle of different area array laser sensors are different. For example, the horizontal viewing angle of some area array laser sensors is 120 degrees, and the vertical viewing angle is 10 degrees; other area array laser sensors have a horizontal viewing angle of 90 degrees and a vertical viewing angle of 9 degrees. For an area array laser sensor, the horizontal visual angle determines the coverage range of the area array laser sensor on the horizontal plane; accordingly, the vertical viewing angle determines the coverage of the area array laser sensor on the vertical plane.
To meet the autonomous movement requirement of the autonomous mobile device, the horizontal-plane coverage of the area array laser sensors disposed on the device, after superposition, must meet the horizontal viewing-angle requirement of the device in normal operation.
Alternatively, the viewing angle requirement of the autonomous mobile device in the normal operation on the horizontal plane may be 60 degrees to 270 degrees, which requires that the coverage of the area array laser sensor arranged on the autonomous mobile device in the horizontal plane is 60 degrees to 270 degrees.
For example, in the case of a transfer robot working in a warehouse, when the transfer robot moves in the warehouse, whether the transfer robot can pass or not needs to be judged according to a horizontal viewing angle of about 60-150 degrees, which requires that the coverage range on the horizontal plane of an area array laser sensor arranged on the transfer robot reaches 60-150 degrees.
For another example, taking a sweeping robot working in a home environment as an example, in order to perform a sweeping task in the home environment, positioning and map building need to be implemented according to a horizontal viewing angle of about 150-270 degrees, which requires that the coverage of the area array laser sensor disposed on the sweeping robot on the horizontal plane reach 150-270 degrees.
For another example, taking a welcome robot working in an environment such as a shopping mall, when executing its welcome task the robot needs to guide customers and track targets according to a horizontal viewing angle of about 270 degrees, which requires that the coverage of the area array laser sensor disposed on the welcome robot on the horizontal plane reach 270 degrees.
Of course, if the autonomous mobile device also requires a coverage area on a vertical surface, similar to the coverage area of the area array laser sensor arranged on the autonomous mobile device on a horizontal surface, the coverage area of the area array laser sensor arranged on the autonomous mobile device on the vertical surface also needs to meet the requirement of the viewing angle of the autonomous mobile device normally working on the vertical surface, and details are not described.
It should be noted that the requirement for the viewing angle of the autonomous mobile device in the horizontal plane and the vertical plane may be satisfied by covering with one area array laser sensor, or by covering with at least two area array laser sensors. If the method is realized by adopting an area array laser sensor, the horizontal visual angle of the area array laser sensor needs to be larger than or equal to the horizontal visual angle required by the normal operation of the autonomous mobile equipment; accordingly, the vertical viewing angle of the area array laser sensor needs to be greater than or equal to that required for normal operation of the autonomous mobile device. If the method is realized by adopting at least two area array laser sensors, the horizontal visual angle obtained by overlapping the horizontal visual angles of the at least two area array laser sensors needs to be larger than or equal to the horizontal visual angle required by the normal operation of the autonomous mobile equipment; accordingly, the vertical viewing angle obtained by superimposing the vertical viewing angles of the at least two area array laser sensors needs to be greater than or equal to the vertical viewing angle required for normal operation of the autonomous mobile device.
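As a small check of the superposition requirement just stated (assuming the sensors are mounted so that their view angles tile without gaps, so superposition is a simple sum):

```python
def fov_requirement_met(sensor_fovs_deg, required_deg):
    # The superimposed view angle of all installed sensors must be at
    # least the view angle the device needs for normal operation.
    return sum(sensor_fovs_deg) >= required_deg

assert fov_requirement_met([150], 150)        # one 150-degree sensor
assert fov_requirement_met([120, 120], 180)   # two 120-degree sensors, with margin
```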
Taking the transfer robot as an example, if one area array laser sensor is provided on the transfer robot, the area array laser sensor with a horizontal viewing angle of 150 degrees may be selected.
In practical applications, some simple application requirements can be met by a single area array laser sensor alone, so for autonomous mobile devices working in such environments, one area array laser sensor may be provided. Other, more complex application requirements need at least two area array laser sensors to solve the environment perception problem, so autonomous mobile devices working in those environments need at least two. The cases of one area array laser sensor and of at least two are described below:
in the case of an area array laser sensor provided for an autonomous mobile device: the area array laser sensor may be disposed on a front side of the device body 101 of the autonomous moving device, where the front side refers to a side toward which the device body 101 faces during forward movement of the autonomous moving device. An autonomous mobile device may move forward or backward (abbreviated as backward) during movement. The term "forward" is understood here to mean: the direction of movement of the autonomous mobile device is frequent or in most cases during the course of the work. Taking an autonomous moving apparatus with a circular outline as an example, the autonomous moving apparatus is provided with an area array laser sensor as shown in fig. 4 a. In addition, taking a cylindrical autonomous moving apparatus as an example, the autonomous moving apparatus is provided withThe state of an area array laser sensor is shown in fig. 4 b-4 d. The area array laser sensor is arranged on the front side of the equipment body 101 of the autonomous mobile equipment, so that the environmental information in front can be collected more conveniently and accurately in the moving process of the autonomous mobile equipment, the obstacle can be avoided more accurately in the process of moving the autonomous mobile equipment, and the autonomous mobile equipment can move smoothly.
An autonomous mobile device usually has a certain height; both the robots shown in figs. 2a-2d and the unmanned vehicle shown in fig. 3 do. This raises the question of where to place the area array laser sensor along the height of the device body 101. This embodiment does not limit this; the position along the height of the device body 101 may be chosen flexibly according to the application requirements and the height of the device. Taking an autonomous mobile device with a square outer contour as an example, as shown in figs. 4e-4g, the area array laser sensor may be placed at the middle, top, or bottom position along the height of the device body 101. Each of these may be a specific position or a range. With reference to fig. 3, if the height of the device's highest plane above the ground is H2, then the position H2/2 along the height of the device body 101 is the middle position, the highest plane at height H2 is the top position, and the lowest plane is the bottom position. Alternatively, still with reference to fig. 3, the region between (H2/2-d) and (H2/2+d) along the height of the device body 101 is the middle position, the region between (H2-d) and H2 is the top position, and the region between 0 and d is the bottom position, where d is a number greater than 0.
In some application scenarios, the area array laser sensor may need to implement a collision-avoidance function. In such scenarios, the vertically upward and vertically downward view-angle openings of the sensor can be set according to the collision-avoidance requirement. With reference to fig. 3, assume the maximum traveling speed of the autonomous mobile device is S_max and the operation response time is T. The minimum horizontal distance D_min at which the device can still avoid a collision using the area array laser sensor (i.e., the distance below which the device may collide) can be calculated as D_min = S_max * T. Meanwhile, if the height of the sensor's emitting and receiving port above the ground is H1 and the height of the device's highest plane above the ground is H2, then the vertically downward view-angle opening of the sensor is theta_down = arctan(H1 / D_min) and the vertically upward view-angle opening is theta_up = arctan((H2 - H1) / D_min). Therefore, for an autonomous mobile device with given traveling speed, operation response time, and height, an area array laser sensor with these vertical view-angle openings can be selected to realize the collision-avoidance function.
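The sizing above is directly computable. A short sketch implementing D_min = S_max * T, theta_down = arctan(H1 / D_min), and theta_up = arctan((H2 - H1) / D_min); the sample numbers are illustrative only:

```python
import math

def collision_avoidance_openings(s_max, t_resp, h1, h2):
    """Return (D_min, theta_down, theta_up) in meters and degrees for a
    device with top speed s_max (m/s), response time t_resp (s), sensor
    port height h1 (m), and highest-plane height h2 (m)."""
    d_min = s_max * t_resp
    theta_down = math.degrees(math.atan(h1 / d_min))
    theta_up = math.degrees(math.atan((h2 - h1) / d_min))
    return d_min, theta_down, theta_up

# e.g. 0.5 m/s top speed, 0.2 s response, port at 0.10 m, body top at 0.35 m:
print(collision_avoidance_openings(0.5, 0.2, 0.10, 0.35))
# -> D_min = 0.1 m, theta_down = 45.0 deg, theta_up ~ 68.2 deg
```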
It is worth mentioning that the size of the vertical upward and downward view angle opening of the area array laser sensor can be flexibly adjusted according to different application requirements. The following examples illustrate:
for example, when using an area array laser sensor for object recognition, tracking and classification, the vertical upward view opening may be increased and the vertical downward view opening may be decreased as long as the area array laser sensor is able to cover objects within a suitable distance (e.g., a distance of about 0.6 meters, which is a one-step distance typically maintained by selecting people and people for face-to-face communication).
For another example, while traveling, the autonomous mobile device may need the area array laser sensor to detect height differences on the ground, such as protruding debris or the drop between steps, so as to judge in advance whether the height difference would block its chassis; if not, the device passes over it, otherwise the route is abandoned. In this case, the vertically downward view-angle opening of the sensor can be chosen as the theta_down calculated by the formula above, and the vertically upward view-angle opening only needs to reach the horizontal line or above.
For another example, when the area array laser sensor is used for positioning and map building, in theory the vertical viewing angle only needs some opening, and its size is not limited. Of course, if the opening of the vertical viewing angle is sufficiently large, an indoor 3D map can be built with it; such a 3D map can cover the entire working environment of the autonomous mobile device in the length and width directions, and in the height direction can cover the body height of the device or the region above it, where body height refers to the distance between the device's highest plane and the plane on which it stands.
In addition to the vertical viewing angle required to meet the application requirements, the horizontal viewing angle of the area array laser sensor should be greater than or equal to the horizontal viewing angle required for the autonomous mobile device to operate normally, for example, greater than 90 degrees, 120 degrees, 150 degrees, 180 degrees, etc.
In the case where at least two area array laser sensors are provided on the autonomous mobile device, they are disposed at different positions on the device body 101. For example, at least two area array laser sensors may be distributed around the device body 101, i.e., along its circumference, so that environmental information in the working environment can be collected over 360 degrees. For another example, at least two area array laser sensors may be disposed at different positions on the front side of the device body 101, where "front side" again refers to the side the device body 101 faces while moving forward, so that the environmental information ahead of the device can be collected from multiple angles; "forward" is again the direction in which the device moves frequently, or in most cases, during its work.
It should be noted that the at least two area array laser sensors are disposed at different positions on the apparatus body 101, and in addition, the problem of the height of the at least two area array laser sensors in the height direction of the apparatus body 101 needs to be considered. The embodiment does not limit the above, and the setting heights of the at least two area array laser sensors in the height direction of the device body 101 may be flexibly selected according to the application requirements and the height of the autonomous mobile device.
In the alternative embodiment A1, some of the at least two area array laser sensors are disposed at the same height on the device body 101. For example, several area array laser sensors may be arranged at one important height to ensure that richer environmental information is collected at that height.
In the optional embodiment a2, the arrangement heights of the area array laser sensors in the at least two area array laser sensors on the device body 101 are different, so that the environmental information at different height positions can be acquired, and the abundance of the environmental information is improved.
Further, if the setting heights of the area array laser sensors in the at least two area array laser sensors on the apparatus body 101 are different, optionally, the at least two area array laser sensors are located on the same straight line in the height direction of the apparatus body 101.
Further, if the setting heights of the area array laser sensors in the at least two area array laser sensors on the device body 101 are different, optionally, the farthest visual distance ends of the vertical visual angles of the two adjacent area array laser sensors in the height direction of the device body 101 may intersect; alternatively, the vertical viewing angles of two area array laser sensors adjacent in the height direction of the apparatus body 101 partially overlap. Therefore, seamless coverage of at least two area array laser sensors on a vertical plane can be guaranteed, and more abundant environmental information can be collected.
In the alternative embodiment A3, all of the at least two area array laser sensors are installed at the same height on the device body 101; in short, the sensors sit at one common height.
In addition, when viewed from the horizontal direction of the apparatus body 101, the farthest visual distance ends of the horizontal visual angles of the adjacent area array laser sensors intersect, or the horizontal visual angles of the adjacent area array laser sensors partially overlap, or the boundaries of the horizontal visual angles of the adjacent area array laser sensors are parallel. Taking the outer contour of the autonomous mobile device as a circle as an example, the state that the farthest sight distance ends of the horizontal visual angles of the adjacent area array laser sensors are intersected is shown in fig. 5a and 5 b; the state that the horizontal view angles of the adjacent area array laser sensors are partially overlapped is shown in fig. 5c and 5 d; the state in which the boundaries of the horizontal view angles of the adjacent area array laser sensors are parallel is shown in fig. 5e and 5 f. In addition, taking the outer contour of the autonomous moving apparatus as a triangle as an example, the state where the farthest visual distance ends of the horizontal visual angles of the adjacent area array laser sensors intersect is shown in fig. 6a and 6 b; the state that the horizontal viewing angles of the adjacent area array laser sensors are partially overlapped is shown in fig. 6c and 6 d; the state in which the boundaries of the horizontal view angles of the adjacent area array laser sensors are parallel is shown in fig. 6e and 6 f.
In addition, in the case of at least two area array laser sensors, each area array laser sensor has a horizontal viewing angle and a vertical viewing angle, respectively. Optionally, the horizontal viewing angles of the area array laser sensors in the at least two area array laser sensors may be the same, and certainly, the horizontal viewing angles of some of the area array laser sensors may be the same, or the horizontal viewing angles of the area array laser sensors are different. Similarly, the vertical viewing angles of the area array laser sensors in the at least two area array laser sensors may be the same, and certainly, the vertical viewing angles of some of the area array laser sensors may be the same, or the vertical viewing angles of the area array laser sensors are different.
The following is an example of two area array laser sensors, and the arrangement position and height of the area array laser sensors are exemplarily described. For convenience of description and distinction, the two area array laser sensors are referred to as a first area array laser sensor and a second area array laser sensor.
The first area array laser sensor and the second area array laser sensor are respectively disposed at different positions of the apparatus body 101. Alternatively, the first and second area array laser sensors are provided at 45 ° left and 45 ° right front sides of the apparatus body 101, respectively.
Further alternatively, the first and second area array laser sensors are disposed at the left-front 45° and right-front 45° positions at the same height on the device body 101. For example, both are disposed at the left-front 45° and right-front 45° positions at the middle position of the device body 101, or at the top position, or at the bottom position.
Further alternatively, the first and second area array laser sensors are provided on the apparatus body 101 at positions of 45 ° on the front left side and 45 ° on the front right side at different height positions, respectively. For example, the first area array laser sensor is provided at a left front side 45 ° position or a right front side 45 ° position on the middle position of the apparatus body 101, and correspondingly, the second area array laser sensor is provided at a right front side 45 ° position or a left front side 45 ° position on the top position of the apparatus body 101. For another example, the first area array laser sensor is provided at a left front side 45 ° position or a right front side 45 ° position on the middle position of the apparatus body 101, and correspondingly, the second area array laser sensor is provided at a right front side 45 ° position or a left front side 45 ° position on the bottom position of the apparatus body 101. For another example, the first area array laser sensor is provided at a left front side 45 ° position or a right front side 45 ° position on the top position of the apparatus body 101, and correspondingly, the second area array laser sensor is provided at a right front side 45 ° position or a left front side 45 ° position on the bottom position of the apparatus body 101.
In addition to the above autonomous mobile devices, embodiments of the present application also provide some control methods applicable to autonomous mobile devices. These control methods are explained below:
fig. 7 is a flowchart illustrating a control method for an autonomous mobile device according to an exemplary embodiment of the present application.
As shown in fig. 7, the method includes:
701. controlling an area array laser sensor on the autonomous mobile equipment to acquire environmental information in the operating environment of the autonomous mobile equipment;
702. acquiring environmental information acquired by an area array laser sensor;
703. and performing function control on the autonomous mobile equipment according to the environment information.
In this embodiment, an area array laser sensor is disposed on the autonomous mobile device and used to collect environmental information in the device's working environment. The environmental information includes direction information, distance information, and reflectivity information of the obstacles around the device; aided by deep learning techniques in three-dimensional scenes, recognition of environmental elements can be achieved, and the device can then be functionally controlled according to the environmental information.
When the laser lines are numerous and dense, the data formed by the reflectivity information can be regarded as texture information, from which environmental features with matching and recognition value can be extracted; the environment-recognition capability is strong, and the advantages of visual algorithms operating on texture information can be enjoyed to a certain extent. The area array laser sensor therefore combines the strengths of line laser sensors and vision sensors well: it improves the spatial understanding of the autonomous mobile device of its environment and qualitatively improves its obstacle-recognition performance, potentially bringing the device's spatial understanding of the environment close to the level of human eyes. In addition, compared with a sensing scheme based on an image sensor, the area array laser sensor provides more accurate distance and direction information, reduces the complexity of the sensing computation, and improves real-time performance.
Optionally, the environment information is preprocessed, and the preprocessing includes one or any combination of parsing, synchronizing, fusing, and filtering. For the explanation of parsing, synchronizing, fusing and filtering, reference may be made to the description in the foregoing embodiments, and details are not repeated here.
Optionally, according to the environmental information collected by the area array laser sensor, the autonomous mobile device may be controlled to perform at least one of the following functions: constructing an environment map based on the environmental information; performing navigation and positioning based on the environmental information; and performing object recognition based on the environmental information.
Optionally, a 3D environment map may be constructed based on environment information collected by the area array laser sensor. Of course, a 2D environment map may also be constructed if the positioning requirements are met.
Optionally, the area array laser sensor may be combined with a simultaneous localization and mapping (SLAM) algorithm to implement a SLAM process based on the area array laser sensor. In this embodiment, that process is as follows:
the odometer or other sensors on the autonomous mobile device provide pose prediction data P, and the area array laser sensor provides observation data Z, which is input data of the SLAM algorithm, and the pose prediction data P and the observation data Z can be input together by one frame or multiple frames of data, which are collectively referred to as a group of data. The observation data Z is environment information in the autonomous mobile device working environment acquired by the area array laser sensor.
Each group of data is preprocessed, covering one or any combination of parsing, synchronization, fusion, and filtering. The data then enter the data-association process, which includes data pairing, feature tracking, closed-loop detection, and the like. Next, maximum a posteriori estimation is performed to estimate the pose and update the map information accordingly. Feature tracking selects, from a group of data, partial data carrying easily recognizable marks as a feature set, such as boundary lines, centroids, and corner points, and achieves tracking by matching that feature set when subsequent environmental information arrives. An autonomous mobile device may return to a previously visited position while moving, and when it does, its motion trajectory forms a closed loop. Closed-loop detection is the process of reliably recognizing whether such a closed loop has occurred in the trajectory of the autonomous mobile device.
Data groups can be paired using the pose prediction data P provided by the odometer or by a relocalization algorithm; for example, matching between data groups can be achieved with a filtering algorithm such as particle filtering, or a matching algorithm such as ICP, so as to estimate an accurate machine pose P', and an environment map is then built from the estimated machine pose P' and the observation data.
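As one concrete, deliberately simplified example of such matching, a 2D point-to-point ICP refinement of the predicted pose P into an estimate P' might look as follows; the brute-force nearest-neighbour search and fixed iteration count are simplifying assumptions.

```python
import numpy as np

def icp_2d(source, target, init_pose=None, iters=20):
    """Refine a predicted pose into P' by iteratively pairing points and
    solving the rigid alignment in closed form (SVD)."""
    T = np.eye(3) if init_pose is None else init_pose.copy()
    src = (T[:2, :2] @ source.T).T + T[:2, 2]
    for _ in range(iters):
        # Data pairing: each source point to its nearest target point.
        idx = np.array([np.argmin(np.linalg.norm(target - p, axis=1)) for p in src])
        matched = target[idx]
        # Closed-form rigid alignment of the paired sets.
        mu_s, mu_t = src.mean(axis=0), matched.mean(axis=0)
        U, _, Vt = np.linalg.svd((src - mu_s).T @ (matched - mu_t))
        R = Vt.T @ U.T
        if np.linalg.det(R) < 0:  # guard against a reflection solution
            Vt[-1] *= -1
            R = Vt.T @ U.T
        t = mu_t - R @ mu_s
        src = (R @ src.T).T + t
        step = np.eye(3)
        step[:2, :2], step[:2, 2] = R, t
        T = step @ T
    return T  # homogeneous 3x3 estimate of the machine pose P'
```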
Alternatively, in constructing the environment map, a spatially partitioned dense map may be selected: the three-dimensional space is divided into individual small grid voxels, forming the three-dimensional (3D) grid map shown in fig. 8, where each voxel carries probability information representing its confidence and indicating whether its state is free, unknown, or occupied. A 2D map partitions a two-dimensional space in the same way that a 3D map partitions a three-dimensional space.
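A minimal voxel-grid structure of this kind, using the common log-odds encoding of the per-voxel probability (an assumption; the application does not fix the encoding), might be:

```python
import numpy as np

class VoxelGrid:
    """Spatially partitioned 3D grid map: each voxel holds a log-odds
    occupancy value, read out as a free / unknown / occupied state."""

    def __init__(self, shape=(200, 200, 60), resolution=0.05):
        self.logodds = np.zeros(shape)  # 0.0 means unknown
        self.resolution = resolution    # metres per voxel edge

    def update(self, ijk, hit, l_hit=0.85, l_miss=-0.4, clamp=3.5):
        """Fold one observation (hit or miss) into a voxel, with clamping."""
        self.logodds[ijk] = np.clip(
            self.logodds[ijk] + (l_hit if hit else l_miss), -clamp, clamp)

    def state(self, ijk):
        l = self.logodds[ijk]
        return "occupied" if l > 0.5 else ("free" if l < -0.5 else "unknown")
```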
After the 3D grid map is obtained, the environment information collected by the area array laser sensor can be matched against the 3D grid map to obtain the actual pose P' of the autonomous mobile device. Further, once the actual pose P' is known, a more accurate 3D grid map can be built incrementally from the observation data Z of the area array laser sensor according to the propagation-and-hit principle of the laser beam.
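Building on the VoxelGrid sketch above, the propagation-and-hit update could be sketched as below; the uniform stepping along the beam is a simplification of a proper 3D voxel traversal, and both endpoints are assumed to lie inside the grid.

```python
import numpy as np

def raycast_update(grid, origin, endpoint):
    """Beam update: voxels the laser beam passes through gain free evidence;
    the voxel containing the return gains occupied evidence."""
    o = np.asarray(origin, dtype=float) / grid.resolution
    e = np.asarray(endpoint, dtype=float) / grid.resolution
    steps = int(np.ceil(np.linalg.norm(e - o)))
    for s in range(steps):  # march along the beam in roughly one-voxel steps
        cell = tuple(np.floor(o + (e - o) * s / max(steps, 1)).astype(int))
        grid.update(cell, hit=False)
    grid.update(tuple(np.floor(e).astype(int)), hit=True)
```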
The general process of positioning based on the constructed environment map is as follows: first, sensors such as the odometer provide pose prediction data P of the autonomous mobile device, and the environment information collected by the area array laser sensor is taken as observation data Z; matching is then performed against the constructed environment map M according to a matching algorithm to obtain the actual pose P' of the autonomous mobile device. If positioning fails during this process, relocalization can be executed to obtain the actual pose P', which solves the problem of positioning with the area array laser sensor in a known environment map.
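The control flow just described (predict, observe, match against the map M, and fall back to relocalization on failure) can be summarized as follows; every callable and the score threshold here are assumptions standing in for concrete components.

```python
def localize(predict, observe, map_match, relocalize, score_min=0.6):
    """One positioning step against a known environment map."""
    P = predict()                     # pose prediction, e.g. from the odometer
    Z = observe()                     # observation from the area array laser sensor
    P_prime, score = map_match(P, Z)  # match Z against the map M near P
    if score < score_min:             # positioning failed
        P_prime = relocalize(Z)       # global relocalization over the map
    return P_prime                    # actual pose P'
```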
In addition to the apparatus and method embodiments described above, embodiments of the present application also provide a computer-readable storage medium storing a computer program that, when executed by one or more processors, causes the one or more processors to perform the following: controlling an area array laser sensor on an autonomous mobile device to collect environment information in the operating environment of the autonomous mobile device; acquiring the environment information output by the area array laser sensor; and performing function control on the autonomous mobile device according to the environment information. Of course, when executed by one or more processors, the computer program may also cause the one or more processors to perform other related actions; for those, reference may be made to the foregoing embodiments, which are not repeated here.
It should be noted that the execution subject of each step of the methods provided in the above embodiments may be the same device, or different devices may serve as the execution subjects. For example, the execution subject of steps 701 to 703 may be device A; alternatively, the execution subject of steps 701 and 702 may be device A while the execution subject of step 703 is device B; and so on.
In addition, some of the flows described in the above embodiments and drawings include a plurality of operations in a specific order, but it should be clearly understood that these operations may be executed out of the order presented herein or in parallel; sequence numbers such as 701 and 702 merely distinguish different operations and do not themselves represent any execution order. The flows may also include more or fewer operations, which may be executed sequentially or in parallel. It should further be noted that the descriptions of "first", "second", and the like herein are used to distinguish different messages, devices, modules, etc.; they neither denote a sequential order nor require the "first" and "second" objects to be of different types.
As will be appreciated by one skilled in the art, embodiments of the present invention may be provided as a method, system, or computer program product. Accordingly, the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present invention may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present invention is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the invention. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
In a typical configuration, a computing device includes one or more processors (CPUs), input/output interfaces, network interfaces, and memory. The memory may include volatile memory in the form of computer-readable media, such as random access memory (RAM), and/or non-volatile memory, such as read-only memory (ROM) or flash memory (flash RAM). Memory is an example of a computer-readable medium.
Computer-readable media, including both permanent and non-permanent, removable and non-removable media, may implement information storage by any method or technology. The information may be computer-readable instructions, data structures, program modules, or other data. Examples of computer storage media include, but are not limited to, phase-change memory (PRAM), static random access memory (SRAM), dynamic random access memory (DRAM), other types of random access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technology, compact disc read-only memory (CD-ROM), digital versatile discs (DVD) or other optical storage, magnetic cassettes, magnetic tape or magnetic disk storage or other magnetic storage devices, or any other non-transmission medium that can be used to store information accessible by a computing device. As defined herein, computer-readable media do not include transitory computer-readable media such as modulated data signals and carrier waves.
It should also be noted that the terms "comprises," "comprising," and any other variations thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such a process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a(n) ..." does not exclude the presence of other identical elements in the process, method, article, or apparatus that includes the element.
The above description is only an example of the present application and is not intended to limit the present application. Various modifications and changes may occur to those skilled in the art. Any modification, equivalent replacement, improvement, etc. made within the spirit and principle of the present application should be included in the scope of the claims of the present application.

Claims (34)

1. An autonomous mobile device, comprising a device body, wherein a control unit and an area array laser sensor are arranged on the device body, and the control unit is electrically connected with the area array laser sensor;
the area array laser sensor is used for collecting environment information in the operating environment of the autonomous mobile device and transmitting the environment information to the control unit; and the control unit is used for performing function control on the autonomous mobile device according to the environment information.
2. The autonomous moving apparatus of claim 1, wherein there is one area array laser sensor, which is disposed on a front side of the apparatus body; the front side is the side toward which the apparatus body faces during forward movement of the autonomous mobile device.
3. The autonomous moving apparatus of claim 2, wherein the area array laser sensor is provided at a middle position, a top position, or a bottom position in a height direction of the apparatus body.
4. The autonomous mobile apparatus of claim 2, wherein a vertical downward view opening θ_down of the area array laser sensor is arctan(H1/D_min), a vertical upward view opening θ_up is arctan((H2-H1)/D_min), and D_min = S_max × T;
wherein H1 denotes the height of the transmitting and receiving port of the area array laser sensor above the ground, H2 denotes the height of the highest plane of the autonomous mobile device above the ground, S_max denotes the maximum traveling speed of the autonomous mobile device, and T denotes the operation reaction time of the autonomous mobile device.
5. The autonomous mobile apparatus of claim 2, wherein a horizontal viewing angle of the area array laser sensor is greater than or equal to a horizontal viewing angle required for normal operation of the autonomous mobile apparatus.
6. The autonomous mobile apparatus of claim 1, wherein there are at least two area array laser sensors, disposed at different positions on the apparatus body.
7. The autonomous mobile apparatus of claim 6, wherein a portion of the at least two area array laser sensors is disposed at the same height on the apparatus body.
8. The autonomous moving apparatus of claim 7, wherein another portion of the at least two area array laser sensors is disposed at a different height on the apparatus body.
9. The autonomous mobile apparatus of claim 8, wherein the farthest line-of-sight ends of the vertical viewing angles of adjacent area array laser sensors intersect in the height direction of the apparatus body; or
the vertical viewing angles of adjacent area array laser sensors partially overlap in the height direction of the apparatus body.
10. The autonomous moving apparatus of claim 8, wherein the at least two area array laser sensors are located on the same line in the height direction of the apparatus body.
11. The autonomous moving apparatus of claim 6, wherein the at least two area array laser sensors are arranged at the same height on the apparatus body.
12. The autonomous mobile apparatus of claim 6, wherein the farthest line-of-sight ends of the horizontal viewing angles of adjacent area array laser sensors intersect in the horizontal direction of the apparatus body; or
the horizontal viewing angles of adjacent area array laser sensors partially overlap in the horizontal direction of the apparatus body; or
the boundaries of the horizontal viewing angles of adjacent area array laser sensors are parallel in the horizontal direction of the apparatus body.
13. The autonomous mobile apparatus of claim 6, wherein the horizontal viewing angles of the at least two area array laser sensors are the same.
14. The autonomous mobile apparatus of claim 6, wherein the vertical viewing angles of the at least two area array laser sensors are the same.
15. The autonomous mobile apparatus of claim 6, wherein the at least two area array laser sensors are disposed around the apparatus body.
16. The autonomous mobile apparatus of claim 6, wherein the at least two area array laser sensors are disposed at a front side of the apparatus body; the front side is a side toward which the device body faces during forward movement of the autonomous mobile device.
17. The autonomous mobile apparatus of claim 16, wherein the at least two area array laser sensors comprise a first area array laser sensor and a second area array laser sensor; the first area array laser sensor and the second area array laser sensor are arranged at 45 degrees to the front-left and 45 degrees to the front-right of the apparatus body, respectively.
18. The autonomous moving apparatus of claim 17, wherein the first and second area array laser sensors are both arranged at the middle position in the height direction of the apparatus body; or
both at the top position in the height direction of the apparatus body; or
both at the bottom position in the height direction of the apparatus body; or
the first area array laser sensor is arranged at the middle position and the second area array laser sensor at the top position in the height direction of the apparatus body; or
the first area array laser sensor is arranged at the middle position and the second area array laser sensor at the bottom position in the height direction of the apparatus body; or
the first area array laser sensor is arranged at the top position and the second area array laser sensor at the bottom position in the height direction of the apparatus body.
19. The autonomous moving apparatus of any one of claims 1 to 18, wherein a preprocessing chip is further disposed on the apparatus body, the preprocessing chip being electrically connected to the control unit and the area array laser sensor, respectively;
the preprocessing chip is used for preprocessing the environment information collected by the area array laser sensor and outputting the preprocessed environment information to the control unit; and the control unit is specifically configured to perform function control on the autonomous mobile device according to the preprocessed environment information.
20. The autonomous mobile device of claim 19 wherein the preprocessing comprises one or any combination of parsing, synchronization, fusion, and filtering.
21. The autonomous mobile device of claim 19, wherein an environment sensing module is disposed on the device body, and the preprocessing chip and the area array laser sensor are integrated in the environment sensing module.
22. The autonomous mobile device of claim 21, wherein the environmental awareness module further incorporates a non-area-array laser sensor, the non-area-array laser sensor comprising: one or any combination of an ultrasonic sensor, an infrared sensor, a vision sensor, a single-line laser sensor and a multi-line laser sensor.
23. The autonomous mobile apparatus of any of claims 1-18, wherein the control unit is specifically configured to perform at least one of the following functional controls:
constructing an environment map based on the environment information;
performing navigation and positioning based on the environment information;
and performing object identification based on the environment information.
24. The autonomous mobile apparatus of claim 23, wherein the environmental information collected by the area array laser sensor comprises: distance information, direction information, and reflectivity information of obstacles around the autonomous mobile device.
25. The autonomous mobile device of claim 24, wherein the control unit is specifically configured to: construct a 3D grid map according to the initial pose of the autonomous mobile device and the distance information, direction information, and reflectivity information of obstacles around the autonomous mobile device.
26. The autonomous mobile device of claim 25, wherein the 3D grid map covers the entire working environment of the autonomous mobile device in the length and width directions, and covers, in the height direction, an area reaching at least the body height of the autonomous mobile device, the body height being the distance between the highest plane of the autonomous mobile device and the plane on which the autonomous mobile device stands.
27. The autonomous mobile apparatus of any of claims 1-18, wherein the area array laser sensor covers 60-270 degrees in the horizontal plane.
28. The autonomous mobile apparatus of claim 27, wherein the area array laser sensor covers 150-270 degrees or 60-150 degrees in the horizontal plane.
29. The autonomous mobile apparatus of any of claims 1-18, wherein the autonomous mobile apparatus has an outer profile shape that is circular, elliptical, square, triangular, drop-shaped, D-shaped, or humanoid.
30. The autonomous mobile apparatus of any of claims 1-18, wherein the autonomous mobile apparatus is a robot.
31. The autonomous mobile apparatus of claim 30, wherein the robot comprises: a floor-sweeping robot, a window-cleaning robot, a home companion robot, an air-purifying robot, or a greeting robot.
32. A method of controlling an autonomous mobile device, comprising:
controlling an area array laser sensor on the autonomous mobile device to collect environment information in the operating environment of the autonomous mobile device;
acquiring the environment information collected by the area array laser sensor;
and performing function control on the autonomous mobile device according to the environment information.
33. The control method according to claim 32, further comprising, before performing function control on the autonomous mobile device according to the environment information:
preprocessing the environment information, wherein the preprocessing comprises one or any combination of parsing, synchronization, fusion, and filtering.
34. A computer-readable storage medium storing a computer program, wherein the computer program, when executed by one or more processors, causes the one or more processors to perform the acts of:
controlling an area array laser sensor on an autonomous mobile device to collect environment information in the operating environment of the autonomous mobile device;
acquiring the environment information output by the area array laser sensor;
and performing function control on the autonomous mobile device according to the environment information.
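As a purely numeric illustration of the viewing-angle relation in claim 4 (all figures are hypothetical and chosen only to show the arithmetic):

```python
import math

# Hypothetical figures: sensor window 8 cm above the floor (H1), machine
# top 10 cm above the floor (H2), top speed 0.3 m/s (S_max), and a 1 s
# operation reaction time (T).
H1, H2, S_max, T = 0.08, 0.10, 0.3, 1.0

D_min = S_max * T                                      # 0.3 m reaction distance
theta_down = math.degrees(math.atan(H1 / D_min))       # ~14.9 degrees downward
theta_up = math.degrees(math.atan((H2 - H1) / D_min))  # ~3.8 degrees upward
print(D_min, theta_down, theta_up)
```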
CN201811089507.3A 2018-08-22 2018-09-18 Autonomous mobile device, control method, and storage medium Pending CN110916562A (en)

Priority Applications (4)

Application Number Priority Date Filing Date Title
CN201811089507.3A CN110916562A (en) 2018-09-18 2018-09-18 Autonomous mobile device, control method, and storage medium
PCT/CN2019/095962 WO2020038155A1 (en) 2018-08-22 2019-07-15 Autonomous movement device, control method and storage medium
EP19852597.4A EP3842885A4 (en) 2018-08-22 2019-07-15 Autonomous movement device, control method and storage medium
US16/542,218 US20200064481A1 (en) 2018-08-22 2019-08-15 Autonomous mobile device, control method and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201811089507.3A CN110916562A (en) 2018-09-18 2018-09-18 Autonomous mobile device, control method, and storage medium

Publications (1)

Publication Number Publication Date
CN110916562A (en) 2020-03-27

Family

ID=69855791

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811089507.3A Pending CN110916562A (en) 2018-08-22 2018-09-18 Autonomous mobile device, control method, and storage medium

Country Status (1)

Country Link
CN (1) CN110916562A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112515560A (en) * 2020-11-06 2021-03-19 珠海市一微半导体有限公司 Method, chip and robot for acquiring cleaning direction through laser data
WO2023045639A1 (en) * 2021-09-23 2023-03-30 追觅创新科技(苏州)有限公司 Method for determining target object, mobile robot, storage medium, and electronic apparatus

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101309430A (en) * 2008-06-26 2008-11-19 天津市亚安科技电子有限公司 Video image preprocessor on basis of FPGA
CN103140097A (en) * 2011-12-05 2013-06-05 西安灵境科技有限公司 Electronic device of pretreat information unit
CN106601247A (en) * 2016-12-30 2017-04-26 歌尔股份有限公司 Speech recognition system for smart home and smart home
CN106681330A (en) * 2017-01-25 2017-05-17 北京航空航天大学 Robot navigation method and device based on multi-sensor data fusion
US20170307736A1 (en) * 2016-04-22 2017-10-26 OPSYS Tech Ltd. Multi-Wavelength LIDAR System
KR20180011510A (en) * 2016-07-25 2018-02-02 인하대학교 산학협력단 Lidar sensor system for near field detection
CN108107417A (en) * 2017-11-07 2018-06-01 北醒(北京)光子科技有限公司 A kind of solid-state face battle array laser radar apparatus
CN108274463A (en) * 2017-01-06 2018-07-13 苏州华兴致远电子科技有限公司 Train Ku Jian robots and Train Parts detection method
CN108318874A (en) * 2018-04-12 2018-07-24 北醒(北京)光子科技有限公司 A kind of face battle array laser radar and mobile platform
CN108445501A (en) * 2018-04-02 2018-08-24 北醒(北京)光子科技有限公司 A kind of more radar anti-crosstalk system and methods based on SLAM technologies

Similar Documents

Publication Publication Date Title
JP7341652B2 (en) Information processing device, information processing method, program, and system
CN110023867B (en) System and method for robotic mapping
CN108290294B (en) Mobile robot and control method thereof
KR102670610B1 (en) Robot for airport and method thereof
US10665115B2 (en) Controlling unmanned aerial vehicles to avoid obstacle collision
Sabattini et al. The pan-robots project: Advanced automated guided vehicle systems for industrial logistics
CN104536445B (en) Mobile navigation method and system
AU2011352997B2 (en) Mobile human interface robot
EP3842885A1 (en) Autonomous movement device, control method and storage medium
Saha et al. A real-time monocular vision-based frontal obstacle detection and avoidance for low cost UAVs in GPS denied environment
EP3347171B1 (en) Using sensor-based observations of agents in an environment to estimate the pose of an object in the environment and to estimate an uncertainty measure for the pose
EP3656138A1 (en) Aligning measured signal data with slam localization data and uses thereof
KR101642828B1 (en) Obstacle avoidance system and method based on multiple images
WO2020051923A1 (en) Systems And Methods For VSLAM Scale Estimation Using Optical Flow Sensor On A Robotic Device
Chatterjee et al. Mobile robot navigation
KR20220129218A (en) Speed control method of unmanned vehicle to awareness the flight situation about an obstacle, and, unmanned vehicle the performed the method
CN110751336B (en) Obstacle avoidance method and obstacle avoidance device of unmanned carrier and unmanned carrier
US11400593B2 (en) Method of avoiding collision, robot and server implementing thereof
Pritzl et al. Cooperative navigation and guidance of a micro-scale aerial vehicle by an accompanying UAV using 3D LiDAR relative localization
CN110916562A (en) Autonomous mobile device, control method, and storage medium
Yuan et al. Laser-based navigation enhanced with 3D time-of-flight data
US11774983B1 (en) Autonomous platform guidance systems with unknown environment mapping
KR102249485B1 (en) System and method for autonomously traveling mobile robot
Noaman et al. Landmarks exploration algorithm for mobile robot indoor localization using VISION sensor
CN116079717A (en) Robot control method, robot, and computer-readable storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination