WO2019114221A1 - Control method, system, and applicable cleaning robot - Google Patents

Control method, system, and applicable cleaning robot

Info

Publication number
WO2019114221A1
Authority
WO
WIPO (PCT)
Prior art keywords
cleaning robot
cleaning
image
area
underexposed
Application number
PCT/CN2018/090659
Other languages
English (en)
French (fr)
Inventor
陈建军
崔彧玮
李磊
Original Assignee
珊口(上海)智能科技有限公司
Application filed by 珊口(上海)智能科技有限公司
Priority to US16/131,335 (US10293489B1)
Publication of WO2019114221A1

Classifications

    • A HUMAN NECESSITIES
    • A47 FURNITURE; DOMESTIC ARTICLES OR APPLIANCES; COFFEE MILLS; SPICE MILLS; SUCTION CLEANERS IN GENERAL
    • A47L DOMESTIC WASHING OR CLEANING; SUCTION CLEANERS IN GENERAL
    • A47L 11/00 Machines for cleaning floors, carpets, furniture, walls, or wall coverings
    • A47L 11/24 Floor-sweeping machines, motor-driven
    • A47L 11/40 Parts or details of machines not provided for in groups A47L 11/02 - A47L 11/38, or not restricted to one of these groups, e.g. handles, arrangements of switches, skirts, buffers, levers
    • A47L 11/4011 Regulation of the cleaning machine by electric means; Control systems and remote control systems therefor
    • A47L 11/4061 Steering means; Means for avoiding obstacles; Details related to the place where the driver is accommodated
    • A47L 11/4063 Driving means; Transmission means therefor
    • A47L 11/4066 Propulsion of the whole machine

Definitions

  • the present application relates to the field of intelligent robots, and in particular to a control method, a control system, and a cleaning robot to which they apply.
  • VSLAM (Visual Simultaneous Localization and Mapping) is simultaneous localization and mapping based on visual information.
  • cleaning robots typically perform cleaning operations repeatedly while moving in a predetermined movement pattern. However, when the cleaning robot moves into an area where dust accumulates, such as under a bed or under a sofa, the user sometimes does not want that area cleaned immediately, either because too much dust has accumulated there or because the user wants the other areas cleaned first and that area cleaned afterwards.
  • the purpose of the present application is to provide a control method, a control system, and a cleaning robot to which they apply, so as to solve the problem that the prior art cannot clean according to the user's needs.
  • a first aspect of the present application provides a control method for a cleaning robot, wherein the cleaning robot includes a camera device, a mobile system, and a cleaning system. The control method includes the following steps: in the cleaning robot's navigation operating environment, controlling the camera device to capture images in real time; analyzing at least one captured image; and controlling the behavior of the cleaning robot according to the result of the analysis and a preset cleaning mode, wherein the cleaning mode includes a cleaning mode corresponding to underexposed areas.
  • a second aspect of the present application provides a control system for a cleaning robot, wherein the cleaning robot includes a camera device, a mobile system, and a cleaning system. The control system includes: a storage device storing one or more programs; and a processing device, coupled to the storage device, which executes the one or more programs to perform the following steps: in the cleaning robot's navigation operating environment, controlling the camera device to capture images in real time; analyzing at least one captured image; and controlling the behavior of the cleaning robot according to the result of the analysis and a preset cleaning mode, wherein the cleaning mode includes a cleaning mode corresponding to underexposed areas.
  • a third aspect of the present application provides a cleaning robot, comprising: a control system as described above, for outputting control commands corresponding to a preset cleaning mode based on the analysis of captured images; a camera device, connected to the control system, for capturing images for the control system to process; a mobile system, connected to the control system, for driving the movement of the cleaning robot based on the control commands; and a cleaning system, connected to the control system, for performing cleaning operations based on the control commands.
  • a fourth aspect of the present application provides a storage medium for a computer device, the storage medium storing at least one program which, when executed by a processor, performs any of the methods described above.
  • the control method and system of the present application, and the cleaning robot to which they apply, have the following beneficial effect: by analyzing the captured images and controlling the behavior of the cleaning robot according to the result of the analysis and a preset cleaning mode, the user can have specific areas cleaned according to his or her requirements, achieving the purpose of cleaning according to the user's needs.
  • FIG. 1 is a schematic view showing the structure of a control system of a cleaning robot of the present application in an embodiment.
  • Figure 2 shows a schematic view of the cleaning robot of the present application handling an underexposed area, in one embodiment.
  • Figure 3 shows a schematic view of the cleaning robot of the present application handling an underexposed area, in another embodiment.
  • FIG. 4 is a flow chart showing a control method of the cleaning robot of the present application in an embodiment.
  • FIG. 5 shows a schematic structural view of a cleaning robot of the present application in an embodiment.
  • the cleaning robot is a device that automatically cleans an area by suctioning debris (for example, dust) from the floor while traveling through the area to be cleaned, without user control. Based on the visual information provided by a vision sensor, combined with the movement data provided by other movement sensors, the cleaning robot can, on the one hand, construct map data of the site where it operates and, on the other hand, provide route planning, route adjustment, and navigation services based on the constructed map data, which makes the cleaning robot more efficient.
  • here, the vision sensor is exemplified by a camera device, and the corresponding visual information is image data (hereinafter simply referred to as images). Examples of the movement sensors include speed sensors, odometer sensors, distance sensors, cliff sensors, and the like. However, when the cleaning robot moves into an area where dust accumulates, such as under a bed or under a sofa, the user may not want that area cleaned immediately, either because too much dust has accumulated there or because the user wishes it to be cleaned only after the other areas are cleaned.
  • the present application provides a control system for a cleaning robot.
  • the cleaning robot comprises a camera device, a mobile system and a cleaning system.
  • the camera device is coupled to a control system of the cleaning robot for capturing images for processing by the control system.
  • the image pickup apparatus includes, but is not limited to, a camera, a video camera, a camera module integrated with an optical system or a CCD chip, a camera module integrated with an optical system and a CMOS chip, and the like.
  • the power supply system of the camera device can be controlled by the power supply system of the cleaning robot, so that the camera device starts capturing images once the robot is powered on and moving.
  • the camera device may be provided on the body of the cleaning robot.
  • the camera device may be disposed at a middle or edge of the top cover of the cleaning robot, or the camera device may be disposed below the plane of the top surface of the cleaning robot, near the geometric center of the body or near the edge of the body.
  • the camera device can be located on the top surface of the cleaning robot, with the optical axis of its field of view within ±30° of the vertical.
  • for example, the camera device is located at an intermediate position or at the edge of the top surface of the cleaning robot, and its optical axis makes an angle with the vertical of -30°, -29°, -28°, -27°, ..., -1°, 0°, 1°, 2°, ..., 29°, or 30°.
  • it should be noted that the listed angles between the optical axis and the vertical are only examples and are not limited to a precision of 1°; according to the design requirements of the actual robot, the angular precision can be higher, such as 0.1° or 0.01°, and exhaustive examples are not given here.
  • the mobile system is coupled to a control system of the cleaning robot that drives cleaning robot movement based on control commands output by the control system.
  • the mobile system includes a drive control device and at least two roller sets.
  • at least one of the at least two roller sets is a controlled roller set.
  • the drive control device is connected to the control system and drives the controlled roller set to roll based on control commands output by the control system.
  • the drive control device includes a drive motor.
  • the drive motor is connected to the roller sets and drives them to roll directly.
  • the drive control device may also include one or more processors (such as a CPU or a microcontroller unit (MCU)) dedicated to controlling the drive motor.
  • the processor is configured to convert control commands output by the control system into electrical signals for controlling the drive motor, and to control the rotation speed, steering, etc. of the drive motor according to those signals so as to drive the cleaning robot to move.
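As a rough illustration of that conversion step, the sketch below maps a high-level motion command (linear velocity v, angular velocity ω) to left and right wheel speeds for a differential-drive base, which the motor controller would then turn into drive signals. The function name and wheel-base constant are illustrative assumptions, not values from this application.

```python
# Hypothetical sketch: differential-drive kinematics for a cleaning robot.
# v = (v_right + v_left) / 2,  w = (v_right - v_left) / wheel_base.

WHEEL_BASE_M = 0.23  # assumed distance between the two drive wheels, in meters


def motion_command_to_wheel_speeds(v_mps: float, w_radps: float) -> tuple[float, float]:
    """Convert a (linear, angular) velocity command into per-wheel speeds."""
    v_left = v_mps - (w_radps * WHEEL_BASE_M / 2.0)
    v_right = v_mps + (w_radps * WHEEL_BASE_M / 2.0)
    return v_left, v_right
```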
  • the processor in the drive control device may be shared with a processor in the control system or may be independently set.
  • the processor in the drive control device functions as a slave processing device, and the processor in the control system functions as a master device, and the drive control device performs motion control based on control of the control system.
  • the processor in the drive control device is shared with a processor in the control system.
  • the drive control device receives the control command output by the control system through the program interface.
  • the drive control device drives the controlled roller set to roll based on the control commands output by the control system.
  • the cleaning system is coupled to a control system of the cleaning robot that performs a cleaning operation based on a control command output by the control system.
  • the cleaning system includes a cleaning assembly and a cleaning drive control assembly.
  • the cleaning drive control assembly is coupled to the control system, the cleaning drive control assembly driving the cleaning assembly to clean the ground based on a control command output by the control system.
  • the cleaning assembly may include a roller brush assembly, a filter screen, a scrubbing assembly, a suction duct, a dust box (or a garbage box), a suction motor, and the like.
  • the roller brush assembly and the scrubbing assembly may each be included or omitted according to the actual design of the cleaning robot.
  • the roller brush assembly includes, but is not limited to, a side brush, a side brush driver, a roller, a roller driver, and the like.
  • the scrubbing assembly includes, but is not limited to, a water container, a wiping cloth, a mounting structure for the cloth, a driver for the mounting structure, and the like.
  • the cleaning drive control assembly can include one or more processors (such as a CPU or a microcontroller unit (MCU)) dedicated to controlling the cleaning assembly.
  • the processor in the cleaning drive control assembly can be shared with the processor in the control system or can be independently set.
  • the processor in the cleaning drive control assembly functions as a slave processing device, the processor in the control system as a master device, and the cleaning drive control component performs a cleaning operation based on a control command output by the control system.
  • the processor in the cleaning drive control assembly is shared with the processor in the control system.
  • FIG. 1 is a schematic structural view of a control system of a cleaning robot of the present application in an embodiment.
  • the control system of the cleaning robot of the present application includes a storage device 11 and a processing device 13.
  • the storage device 11 stores one or more programs.
  • the programs include those that are invoked by the processing device 13, as described later, to perform the control and analysis steps.
  • the storage device includes, but is not limited to, high-speed random access memory and non-volatile memory, for example one or more disk storage devices, flash memory devices, or other non-volatile solid-state storage devices.
  • in certain embodiments, the storage device may also include memory remote from the one or more processors, such as network-attached storage accessed via RF circuitry or external ports and a communication network (not shown), where the communication network may be the Internet, one or more intranets, a local area network (LAN), a wide area network (WAN), a storage area network (SAN), etc., or a suitable combination thereof.
  • the memory controller can control access to the storage device by other components of the robot, such as the CPU and peripheral interfaces.
  • the processing device 13 is coupled to the storage device 11 and is capable of data communication with the camera device, the mobile system, and the cleaning system described above.
  • Processing device 13 may include one or more processors.
  • Processing device 13 is operatively coupled to volatile memory and/or non-volatile memory in storage device 11.
  • the processing device may execute instructions stored in the memory and/or the non-volatile storage device to perform operations in the robot, such as analyzing the captured image and controlling the behavior of the cleaning robot based on the analysis result, and the like.
  • the processor may include one or more general-purpose microprocessors, one or more application-specific integrated circuits (ASICs), one or more digital signal processors (DSPs), one or more field-programmable gate arrays (FPGAs), or any combination thereof.
  • the processing device is also operatively coupled to I/O ports that enable the robot to interact with various other electronic devices, and to input structures that enable the user to interact with the robot.
  • the input structure can include buttons, keyboards, mice, trackpads, and the like.
  • the other electronic devices may be, for example, a motor in the robot's mobile system, or a slave processor in the robot dedicated to controlling the mobile system, such as a microcontroller unit (MCU).
  • the processing device is coupled to the storage device and the camera device, the mobile system, and the cleaning system, respectively, via data lines.
  • the processing device interacts with the storage device through a data reading and writing technology, and the processing device interacts with the camera device, the mobile system, and the cleaning system through an interface protocol.
  • the data reading and writing technology includes but is not limited to: a high speed/low speed data interface protocol, a database read and write operation, and the like.
  • the interface protocols include, but are not limited to, an HDMI interface protocol, a serial interface protocol, and the like.
  • the processing device 13, by retrieving the programs stored in the storage device 11, performs the following steps: in the cleaning robot's navigation operating environment, controlling the camera device to capture images in real time; analyzing at least one captured image; and controlling the behavior of the cleaning robot according to the result of the analysis and a preset cleaning mode, wherein the cleaning mode includes a cleaning mode corresponding to underexposed areas.
  • the navigation operation environment refers to an environment in which the robot moves and performs corresponding operations according to the current positioning and the navigation route determined based on the current positioning.
  • the navigation operating environment of the cleaning robot refers to an environment in which the cleaning robot moves according to the navigation route and performs a cleaning operation.
  • the processing device 13 controls the camera to capture an image in real time in the cleaning robot navigation operating environment.
  • the camera device may be a camera for taking still images or videos.
  • the cleaning robot may preset a time interval for capturing images according to the navigation operating environment, and the processing device then controls the camera device to capture an image at each preset interval, acquiring still images at different times.
  • the processing device controls the camera to capture a video.
  • the processing device 13 then analyzes the captured at least one image.
  • the processing device may analyze at least one of the acquired still images.
  • when a video is captured, the processing device may first extract image frames from the video, continuously or at intervals, and then select individual frames as images for analysis.
  • the processing device can analyze one or more images.
  • the processing device 13 controls the behavior of the cleaning robot based on the result of the analysis and in accordance with a preset cleaning mode; wherein the cleaning mode includes a cleaning mode corresponding to the underexposed region.
  • the result of the analysis indicates either that the cleaning robot is located in an underexposed area or that it is not.
  • an underexposed area is a dark area formed when an opaque object blocks a light source (the sun, a lamp, etc.) so that light cannot pass through, such as the area under a bed or under a sofa.
  • an underexposed area may also refer to the situation in which the cleaning robot is located somewhere the light intensity is so weak that too little light enters the camera device and the brightness of the captured image falls below a preset brightness threshold; that is, the image is underexposed.
  • the brightness of the captured image may be described by image grayscale values. For example, when the processing device detects that the image contains an area whose grayscale values are smaller than a preset grayscale threshold, it determines the image to be underexposed and thus determines that the cleaning robot is located in an underexposed area.
  • the brightness may also be described by a light intensity value provided by a light sensor in the camera device; for example, the processing device acquires an image together with the corresponding light intensity data, and when the light intensity data is less than a preset light intensity threshold, the processing device determines that the cleaning robot is located in an underexposed area.
  • in still other examples, the processing device determines whether the cleaning robot is located in an underexposed area based on both the grayscale values in the image and the light intensity data; for example, it determines that the cleaning robot is in an underexposed area only when the conditions of both of the above examples are satisfied simultaneously.
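To make the combined criterion concrete, the following is a minimal sketch of the last example above: underexposure is declared only when the image grayscale cue and the light-sensor cue agree. All thresholds are illustrative placeholders; the application leaves their values to design experience or experiment.

```python
import numpy as np

GRAY_THRESHOLD = 50         # assumed grayscale threshold on a 0-255 scale
DARK_AREA_RATIO = 0.8       # assumed fraction of dark pixels that flags an underexposed image
LIGHT_INTENSITY_MIN = 10.0  # assumed light-sensor threshold, in arbitrary sensor units


def is_underexposed(gray_image: np.ndarray, light_intensity: float) -> bool:
    """Flag underexposure only when BOTH the image cue and the sensor cue agree."""
    dark_ratio = float(np.mean(gray_image < GRAY_THRESHOLD))
    image_dark = dark_ratio > DARK_AREA_RATIO
    sensor_dark = light_intensity < LIGHT_INTENSITY_MIN
    return image_dark and sensor_dark
```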
  • the preset cleaning mode may be preset and stored by a storage device in the cleaning robot.
  • the cleaning mode can also be obtained by defining a cleaning area of the robot in advance.
  • the user can set the cleaning areas according to different classification methods; for example, based on underexposure, the cleaning areas can be divided into underexposed areas and non-underexposed areas.
  • alternatively, the user can set different cleaning areas for the cleaning robot, such as a room area, a living room area, a kitchen area, etc.; areas of different categories may overlap, for example the areas under the bed and under the sofa in the living room area may belong to the underexposed areas, while the other areas belong to the non-underexposed areas.
  • the user can input a cleaning mode corresponding to each cleaning area to the cleaning robot.
  • the cleaning mode includes a cleaning mode corresponding to the underexposed area.
  • the cleaning mode corresponding to an underexposed area may be set to: stop cleaning while the cleaning robot is in the underexposed area; clean the underexposed area later (lagged cleaning); clean it intensively (focused cleaning); or keep cleaning normally while the cleaning robot is in the underexposed area.
  • here, cleaning may include sweeping, vacuuming, wiping, or any combination thereof.
  • lagged cleaning may mean, for example, that while the cleaning robot is located in an underexposed area it does not clean that area but continues to clean other areas, and returns to clean the underexposed area after the other areas have been cleaned.
  • focused cleaning may enhance the cleaning effect by adjusting the operation of the cleaning assembly in at least one way, such as increasing the rotational speed of the roller brush assembly, spraying liquid through a liquid applicator, increasing the pressure of the scrubbing assembly, or increasing the suction force of the vacuum air passage.
  • the cleaning modes also include other modes, for example continuing normal cleaning while the cleaning robot is not in an underexposed area.
  • the processing device in the cleaning robot may determine a cleaning mode corresponding to the underexposed region based on user input.
  • the cleaning robot can also comprise a human-machine interaction device, which is also connected to the processing device. The user can directly input the cleaning mode corresponding to each underexposed area on the human-machine interaction device provided by the cleaning robot.
  • alternatively, the cleaning robot includes a network device connected to the processing device; the user's other intelligent terminals (such as a mobile phone, a tablet computer, or a personal computer) can exchange data with the processing device through the network device, so that the cleaning mode the user enters on the intelligent terminal for each underexposed area is transmitted to the processing device, which stores the correspondence in the storage device of the cleaning robot.
  • the underexposed areas may be pre-calibrated, obtained through image analysis performed by the cleaning robot, or determined by combining pre-calibration and image analysis.
  • Each under-exposed area may correspond to a uniform cleaning mode, or a separate cleaning mode, or a cleaning mode according to the classification of the under-exposed area.
  • the behavior of the cleaning robot may include, when the robot is in an underexposed area: moving along the original navigation route while continuing to clean; moving along the original route while stopping cleaning; moving along the original route with focused cleaning; moving along the original route with lagged cleaning; or modifying the navigation route while continuing to clean.
  • the behavior of the cleaning robot may also include continued cleaning along the original navigation route when the cleaning robot is not in the underexposed area, and the like.
  • the above cleaning modes and behaviors of the cleaning robot are merely examples and do not limit the present application; in practice, technicians can configure other cleaning modes and robot behaviors according to the type of cleaning robot, user requirements, and so on, which are not enumerated here.
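The relationship between the analysis result, the preset cleaning mode, and the resulting behavior can be summarized as a simple dispatch. The sketch below is one hypothetical encoding of the modes and behaviors listed above; the mode names and returned action strings are invented for illustration.

```python
from enum import Enum, auto


class CleaningMode(Enum):
    STOP = auto()    # do not clean underexposed areas
    LAG = auto()     # clean underexposed areas after everything else
    FOCUS = auto()   # clean underexposed areas more intensively
    NORMAL = auto()  # treat underexposed areas like any other area


def control_behavior(in_underexposed_area: bool, mode: CleaningMode) -> str:
    """Map the analysis result plus the preset cleaning mode to a behavior."""
    if not in_underexposed_area:
        return "follow route, normal cleaning"
    if mode is CleaningMode.STOP:
        return "modify route, leave area, no cleaning"
    if mode is CleaningMode.LAG:
        return "follow route, suspend cleaning, revisit this area later"
    if mode is CleaningMode.FOCUS:
        return "follow route, intensified cleaning"
    return "follow route, normal cleaning"
```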
  • in summary, the control system of the cleaning robot of the present application uses the processing device to analyze the images captured by the camera device and controls the behavior of the cleaning robot according to the result of the analysis and a preset cleaning mode, so that for certain specific areas users can have cleaning performed according to their requirements, achieving the purpose of cleaning according to the user's needs.
  • the step, performed by the processing device, of analyzing the at least one captured image includes determining the current position of the cleaning robot using positioning features identified in the at least one image.
  • the positioning features include, but are not limited to, shape features, grayscale features, and the like.
  • the shape features include, but are not limited to, corner features, line features, edge features, curved features, and the like.
  • the grayscale features include, but are not limited to, grayscale transition features, grayscale values above or below a grayscale threshold, the size of an area in an image frame covering a predetermined grayscale range, and the like.
  • the number of positioning features that can be identified in the image is usually plural, for example, more than ten.
  • one implementation of determining the current position of the cleaning robot using positioning features identified in at least one image is, for example, to identify in the captured image the graphic of a physical object, match that graphic against a reference graphic, and determine the positioning information of the robot in the current physical space based on the known physical features of the reference.
  • in another implementation, the positioning information of the robot in the current physical space is determined by matching the features identified in the image with the features in the landmark information of preset map data.
  • an implementation of determining the current position of the cleaning robot using positioning features identified in at least two images is, for example, to use the positional offset of matched feature points between the two image frames to determine the position and posture of the robot.
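As a concrete, non-authoritative example of the two-image approach, the OpenCV sketch below matches ORB feature points between two frames and recovers the relative rotation and translation from the essential matrix. The application does not prescribe a particular feature type or solver; this is just one common realization, and it assumes a calibrated camera matrix.

```python
import cv2
import numpy as np


def estimate_pose_offset(img1, img2, camera_matrix):
    """Match ORB keypoints across two grayscale frames and recover relative pose."""
    orb = cv2.ORB_create(nfeatures=500)
    kp1, des1 = orb.detectAndCompute(img1, None)
    kp2, des2 = orb.detectAndCompute(img2, None)
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(des1, des2), key=lambda m: m.distance)
    pts1 = np.float32([kp1[m.queryIdx].pt for m in matches])
    pts2 = np.float32([kp2[m.trainIdx].pt for m in matches])
    E, mask = cv2.findEssentialMat(pts1, pts2, camera_matrix, cv2.RANSAC)
    _, R, t, _ = cv2.recoverPose(E, pts1, pts2, camera_matrix, mask=mask)
    return R, t  # rotation and unit-scale translation between the two frames
```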
  • the processing device positions the cleaning robot using positioning features identified in the at least one image and pre-established landmark information.
  • the landmark information may be attribute information collected during previous navigation and associated with anchor points in the map data, including but not limited to: the positioning features that the camera device can capture at a given anchor point of the map data, the position of each positioning feature in the physical space when it was photographed, the position of the positioning feature in the corresponding image frame when it was photographed previously, and the position and posture of the cleaning robot when the corresponding positioning feature was photographed.
  • the landmark information may be stored with the map data in the storage device.
  • the map data may be pre-built based on SLAM (Simultaneous Localization and Mapping) or VSLAM technology.
  • the positioning analysis that the processing device performs using the captured image and the following step are not necessarily subject to a timing constraint: the processing device further performs the step of analyzing whether the cleaning robot is located in an underexposed area by using the grayscale features in the at least one image.
  • when the cleaning robot is located in an underexposed area, it is usually under a bed, under a sofa, or the like. In such cases the user may not want the area cleaned because of the excessive dust there, or may want it cleaned only after the other areas have been cleaned.
  • the processing device analyzes whether the robot is in an underexposed area.
  • the processing device performs underexposure region analysis by acquiring an image taken by the imaging device.
  • the images captured by the camera device are usually in RGB color mode, so the processing device first converts a captured image to grayscale and then performs underexposure analysis on the grayscale image to determine whether the cleaning robot is located in an underexposed area. The processing device may perform the grayscale conversion using the component method, the maximum-value method, the average-value method, or the weighted-average method.
  • a grayscale image is a monochrome image with 256 levels of gray from black to white, where 255 represents white and 0 represents black.
  • the processing device determines whether the cleaning robot is in an underexposed area by analyzing the grayscale distribution, the grayscale mean, the grayscale minimum and maximum, and the like in the grayscale image.
  • for example, when the processing device analyzes the grayscale distribution of the grayscale image and finds it concentrated within a preset grayscale underexposure interval, it determines that the cleaning robot is in an underexposed area.
  • the grayscale underexposure interval can be determined from technical experience or experimental design and stored in the cleaning robot in advance.
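A minimal sketch of this grayscale pipeline follows, assuming the weighted-average conversion (ITU-R BT.601 luma weights) and an experimentally chosen underexposure interval; both numeric values below are placeholders, not values from this application.

```python
import numpy as np

UNDEREXPOSURE_INTERVAL = (0, 40)  # assumed grayscale interval, from experience/experiment
CONCENTRATION_RATIO = 0.85        # assumed fraction that counts as "concentrated"


def to_grayscale_weighted(rgb: np.ndarray) -> np.ndarray:
    """Weighted-average grayscale conversion of an (H, W, 3) RGB image."""
    gray = 0.299 * rgb[..., 0] + 0.587 * rgb[..., 1] + 0.114 * rgb[..., 2]
    return gray.astype(np.uint8)


def distribution_is_underexposed(gray: np.ndarray) -> bool:
    """True when the grayscale distribution is concentrated in the underexposure interval."""
    lo, hi = UNDEREXPOSURE_INTERVAL
    in_interval = (gray >= lo) & (gray <= hi)
    return float(np.mean(in_interval)) >= CONCENTRATION_RATIO
```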
  • in some embodiments, the camera device is chosen to be an imaging device with an adjustable aperture.
  • the processing device can change the intensity of the light received by the camera by increasing the amount of incoming light; after the incoming light has been compensated, the processing device analyzes the images captured by the adjusted camera device.
  • the aperture is usually disposed in the camera device to adjust the amount of light entering it. For example, when the camera device is located in an underexposed area, the amount of incoming light can be increased automatically by opening up the aperture; the processing device then performs the above underexposure analysis on at least one image captured by the adjusted camera device, which improves the accuracy of the analysis result.
  • so that the processing device can respond promptly with the cleaning mode corresponding to the underexposed area, note that in some practical applications, because the camera device automatically adjusts its aperture under different light intensities, there may be cases where the grayscale of a captured image is generally low simply because the aperture has not yet been adjusted when the light intensity changes.
  • the underexposure analysis of multiple images can be performed to improve the accuracy of the underexposure analysis.
  • the processing device may analyze a plurality of images taken within the preset time period to determine whether the cleaning robot is located in the underexposed region.
  • the preset duration can be determined from technical experience or experimental design and stored in the cleaning robot in advance; generally it can be on the order of milliseconds. For example, when the processing device detects that an image is underexposed, it goes on to check whether at least one image subsequently captured by the camera device also exhibits underexposure, and only then determines that the robot is located in an underexposed area.
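One simple way to realize this multi-frame confirmation is a debouncer that declares underexposure only after N consecutive underexposed frames; the window size below is an illustrative assumption.

```python
from collections import deque


class UnderexposureDebouncer:
    """Declare underexposure only after n_frames consecutive underexposed frames."""

    def __init__(self, n_frames: int = 3):
        self.history = deque(maxlen=n_frames)

    def update(self, frame_is_underexposed: bool) -> bool:
        self.history.append(frame_is_underexposed)
        # Confirmed only once the window is full and every frame in it was dark.
        return len(self.history) == self.history.maxlen and all(self.history)
```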
  • the processing device can also determine whether the cleaning robot is in an underexposed area by analyzing the light-sensing information from a photosensitive element.
  • the photosensitive element includes, but is not limited to, a photoresistor, a phototransistor, a photomultiplier tube, a CCD element, a CMOS device, and the like. Furthermore, the photosensitive element may be provided on the body of the cleaning robot.
  • the photosensitive element may be disposed at the front edge of the top cover of the cleaning robot in the traveling direction, or a plurality of photosensitive elements may be spaced apart along the edge of the top cover, so that the cleaning robot is determined to be located in an underexposed area as soon as part of it enters one.
  • the photosensitive element converts the sensed light intensity into light intensity data output to the processing device, and then the processing device analyzes the light intensity data to determine if the cleaning robot is in the underexposed region.
  • the processing device presets an underexposure threshold, and when the obtained light intensity data is lower than the underexposure threshold, the processing device determines that the cleaning robot is located in the underexposed region; otherwise, determines that the cleaning robot is not located in the underexposed region.
  • the underexposure threshold can be known according to technical experience or experimental design and stored in the cleaning robot in advance.
  • determining whether the cleaning robot is located in an underexposed area based on one or more of the above analysis methods is merely an example and does not limit the ways in which the present application determines that the robot is located in an underexposed area.
  • the technician can also combine multiple grayscale analysis methods with the results obtained from the photosensitive element to further evaluate whether the cleaning robot is located in an underexposed area; such combinations are not described one by one here.
  • any analysis based on image grayscale values mentioned in the present application, or any determination on that basis that the cleaning robot is located in an underexposed area, should be regarded as a specific example of the present application.
  • the processing device then performs the step of controlling the behavior of the cleaning robot based on the results of the analysis and in accordance with a preset cleaning mode.
  • the cleaning mode includes a cleaning mode corresponding to the underexposed area.
  • the processing device controls the mobile system to move toward the underexposed area based on the pre-planned route, and the corresponding cleaning mode may be employed for the movement and cleaning operations.
  • the manner in which the processing device controls the behavior of the cleaning robot according to the result of the analysis and according to a preset cleaning mode includes any one of the following:
  • FIG. 2 is a schematic diagram showing the cleaning robot of the present application handling an underexposed area in one embodiment.
  • in Figure 2, the underexposed area is the area under the bed, and the preset cleaning mode corresponding to the underexposed area is not to clean it.
  • the cleaning robot moves in the direction indicated by arrow A in the figure according to the original navigation route and performs the cleaning operation.
  • when the processing device of the cleaning robot determines, by any of the methods above, that the cleaning robot is located in the underexposed area, i.e., under the bed, it modifies the navigation route and controls the mobile system to leave the underexposed area; in this example, the cleaning robot turns 180° as indicated by turning arrow B in the figure and continues moving, so that it leaves the area under the bed and continues in the direction indicated by arrow C, cleaning other non-underexposed areas.
  • the 180° turn shown by turning arrow B in Figure 2 is merely an example; the turning angle is not limited to 180° and can be set flexibly according to actual needs, and exhaustive examples are not given here.
  • FIG. 3 is a schematic diagram showing the cleaning robot of the present application handling an underexposed area in another embodiment.
  • in Figure 3, the underexposed area is the area under the bed, and the preset cleaning mode corresponding to the underexposed area is not to clean it.
  • the cleaning robot moves in the direction indicated by arrow A in the figure according to the original navigation route and performs the cleaning operation. When the processing device determines that the cleaning robot is located in the underexposed area, i.e., under the bed, it keeps the mobile system moving along the original route in the direction indicated by arrow B while controlling the cleaning system not to clean the underexposed area it is currently in; once the cleaning robot is detected to be in a non-underexposed area again, the mobile system continues along the original route in the direction indicated by arrow C and the cleaning system resumes cleaning according to the cleaning mode corresponding to non-underexposed areas.
  • the preset cleaning mode of the corresponding underexposed area is focused cleaning.
  • the cleaning robot moves and performs the cleaning operation according to the original navigation route in the direction indicated by the arrow A in the figure.
  • when the processing device of the cleaning robot determines, by any of the methods above, that the robot is located in the underexposed area, i.e., under the bed, it keeps the mobile system moving along the original route in the direction indicated by arrow B while controlling the cleaning system to perform the focused cleaning operation corresponding to the underexposed area; once the robot is detected to be in a non-underexposed area, the mobile system continues along the original route in the direction indicated by arrow C and the cleaning system performs regular cleaning according to the cleaning mode corresponding to non-underexposed areas.
  • if the processing device decided to apply the no-cleaning mode and control the cleaning system and mobile system based only on grayscale features or on the light-sensing information from the photosensitive element, the result would not be accurate.
  • for this purpose, the storage device stores map data and landmark information, wherein the landmark information includes not only the aforementioned attribute information but also object information represented by the positioning features, the attributes of the corresponding underexposed areas, and so on.
  • the processing device also performs the step of constructing landmark information of the underexposed area in the pre-built map data in the navigation operating environment.
  • when the processing device determines that the cleaning robot has moved from a non-underexposed area into an underexposed area, or from an underexposed area into a non-underexposed area, it adds the corresponding attribute information to the pre-built map data. For example, the processing device determines the boundary position between the underexposed area and the non-underexposed area from its real-time positioning; it then either adds a landmark carrying the underexposed-area attribute at that boundary position in the map data, or adds the underexposed-area attribute to landmark information already existing near the boundary position.
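At the data level, this step might look like the following sketch: landmarks near the detected light/dark boundary are tagged with the underexposed-area attribute, or a new landmark carrying the attribute is created when none is nearby. The Landmark structure and the search radius are hypothetical, not taken from this application.

```python
from dataclasses import dataclass, field


@dataclass
class Landmark:
    position: tuple                       # (x, y) in map coordinates
    features: list = field(default_factory=list)
    underexposed: bool = False            # attribute added or updated at runtime


def mark_boundary(landmarks: list, boundary_pos: tuple, radius: float = 0.3) -> None:
    """Tag landmarks near a dark/bright boundary, or add a new one if none is close."""
    bx, by = boundary_pos
    near = [lm for lm in landmarks
            if (lm.position[0] - bx) ** 2 + (lm.position[1] - by) ** 2 <= radius ** 2]
    if near:
        for lm in near:
            lm.underexposed = True
    else:
        landmarks.append(Landmark(position=boundary_pos, underexposed=True))
```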
  • considering that indoor furnishings and the light admitted from light sources may change, the processing device also performs the step of updating the landmark information of the corresponding underexposed areas in the pre-built map data.
  • for example, when the processing device, after performing underexposure analysis using the image or the photosensitive device, finds that the cleaning robot is in an underexposed area while the corresponding landmark information carries the non-underexposed-area attribute, it updates that landmark information; conversely, when it finds that the cleaning robot is in a non-underexposed area while the corresponding landmark information carries the underexposed-area attribute, it likewise updates the landmark information.
  • the processing device navigates the cleaning robot based on a preset cleaning mode, positioning information identified in at least one image, and the landmark information.
  • the landmark information includes an attribute of an underexposed area or an attribute of a non-underexposed area.
  • for example, according to a preset cleaning mode in which the underexposed areas are not cleaned and the non-underexposed areas are cleaned, and according to the landmark information carrying underexposed-area attributes in the map data, the processing device constructs a cleaning route that does not include the underexposed areas; it determines the current position of the cleaning robot from at least one image taken along the cleaning route and controls the mobile system to move and clean along that route, so that the underexposed areas are not cleaned.
  • for another example, according to a preset cleaning mode in which the underexposed areas receive lagged focused cleaning and the non-underexposed areas are cleaned normally, and according to the landmark information carrying the attributes of each underexposed area in the map data, the processing device constructs a cleaning route in which a first segment covers the non-underexposed areas, a second segment moves from the trailing end of the first segment to the beginning of a third segment, and the third segment covers the underexposed areas; the cleaning system is controlled not to perform cleaning while moving along the second segment, and the processing device then controls the cleaning system to perform the corresponding focused cleaning operation while moving along the third segment according to the preset focused cleaning mode.
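Sketched in code, constructing such a three-segment route could look like the following, where each coverage cell is assumed to carry an underexposed flag and a position; the cell representation is invented for illustration.

```python
def build_lagged_focus_route(cells: list) -> tuple:
    """Split coverage cells into: (1) normal cleaning of non-underexposed cells,
    (2) a transit leg with cleaning suspended, (3) focused cleaning of underexposed cells."""
    normal = [c for c in cells if not c["underexposed"]]
    focus = [c for c in cells if c["underexposed"]]
    transit = (normal[-1]["pos"], focus[0]["pos"]) if normal and focus else None
    return normal, transit, focus
```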
  • in some cases, the current position of the cleaning robot and its navigation may be determined using only the movement data provided by the movement sensors in the cleaning robot.
  • for example, the processing device controls the behavior of the cleaning robot according to the cleaning mode of the corresponding room. When the robot moves, according to the landmark information, to the entrance of an unlit room and image analysis indicates that it is located in an underexposed area, the processing device can identify positioning features from the images cached before entering the room and obtain the door attribute from the landmark information containing those features, thereby determining that the underexposed area is a room that needs cleaning. The processing device therefore computes the distance moved and the current position from the movement data provided by the movement sensors in the cleaning robot, controls the mobile system to move through the room along the original navigation route, and controls the cleaning system to clean the floor and the areas along the walls of the room.
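The fallback described here is classic differential-drive dead reckoning. A minimal sketch, assuming incremental left/right wheel-travel readings from the movement sensors (a midpoint approximation of the pose update):

```python
import math


def integrate_odometry(pose, d_left: float, d_right: float, wheel_base: float):
    """Update (x, y, theta) from incremental wheel travel when vision is unavailable."""
    x, y, theta = pose
    d_center = (d_left + d_right) / 2.0
    d_theta = (d_right - d_left) / wheel_base
    x += d_center * math.cos(theta + d_theta / 2.0)
    y += d_center * math.sin(theta + d_theta / 2.0)
    return (x, y, theta + d_theta)
```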
  • the processing device can also output prompt information after entering the underexposed area.
  • the processing device may alert the user that the cleaning robot has entered the underexposed area by issuing an audible alarm or by sending a message to the user's mobile terminal after entering the underexposed area.
  • the application also provides a control method of the cleaning robot.
  • the cleaning robot includes a camera device, a mobile system, and a cleaning system.
  • the camera device is coupled to a control system of the cleaning robot for capturing images for processing by the control system.
  • the image pickup apparatus includes, but is not limited to, a camera, a video camera, a camera module integrated with an optical system or a CCD chip, a camera module integrated with an optical system and a CMOS chip, and the like.
  • the power supply system of the camera device can be controlled by the power supply system of the cleaning robot, so that the camera device starts capturing images once the robot is powered on and moving. Further, the camera device may be provided on the body of the cleaning robot.
  • the camera device may be disposed at a middle or edge of the top cover of the cleaning robot, or the camera device may be disposed below the plane of the top surface of the cleaning robot, near the geometric center of the body or near the edge of the body.
  • the camera device can be located on the top surface of the cleaning robot, with the optical axis of its field of view within ±30° of the vertical.
  • for example, the camera device is located at an intermediate position or at the edge of the top surface of the cleaning robot, and its optical axis makes an angle with the vertical of -30°, -29°, -28°, -27°, ..., -1°, 0°, 1°, 2°, ..., 29°, or 30°.
  • it should be noted that the listed angles between the optical axis and the vertical are only examples and are not limited to a precision of 1°; according to the design requirements of the actual robot, the angular precision can be higher, such as 0.1° or 0.01°, and exhaustive examples are not given here.
  • the mobile system is coupled to a control system of the cleaning robot that drives cleaning robot movement based on control commands output by the control system.
  • the mobile system includes a drive control device and at least two roller sets.
  • at least one of the at least two roller sets is a controlled roller set.
  • the drive control device is connected to the control system and drives the controlled roller set to roll based on control commands output by the control system.
  • the drive control device includes a drive motor.
  • the drive motor is connected to the roller sets and drives them to roll directly.
  • the drive control device may also include one or more processors (such as a CPU or a microcontroller unit (MCU)) dedicated to controlling the drive motor.
  • the processor is configured to convert control commands output by the control system into electrical signals for controlling the drive motor, and to control the rotation speed, steering, etc. of the drive motor according to those signals so as to drive the cleaning robot to move.
  • the processor in the drive control device may be shared with a processor in the control system or may be independently set.
  • the processor in the drive control device functions as a slave processing device, and the processor in the control system functions as a master device, and the drive control device performs motion control based on control of the control system.
  • the processor in the drive control device is shared with a processor in the control system.
  • the drive control device receives the control command output by the control system through the program interface.
  • the drive control device drives the controlled roller set to roll based on the control commands output by the control system.
  • the cleaning system is coupled to a control system of the cleaning robot that performs a cleaning operation based on a control command output by the control system.
  • the cleaning system includes a cleaning assembly and a cleaning drive control assembly.
  • the cleaning drive control assembly is coupled to the control system, the cleaning drive control assembly driving the cleaning assembly to clean the ground based on a control command output by the control system.
  • the cleaning assembly may include a roller brush assembly, a filter screen, a scrubbing assembly, a suction duct, a dust box (or a garbage box), a suction motor, and the like.
  • the roller brush assembly and the scrubbing assembly may each be included or omitted according to the actual design of the cleaning robot.
  • the roller brush assembly includes, but is not limited to, a side brush, a side brush driver, a roller, a roller driver, and the like.
  • the scrubbing assembly includes, but is not limited to, a water container, a wiping cloth, a mounting structure for the cloth, a driver for the mounting structure, and the like.
  • the cleaning drive control assembly can include one or more processors (such as a CPU or a microcontroller unit (MCU)) dedicated to controlling the cleaning assembly.
  • the processor in the cleaning drive control assembly can be shared with the processor in the control system or can be independently set.
  • the processor in the cleaning drive control assembly functions as a slave processing device, the processor in the control system as a master device, and the cleaning drive control component performs a cleaning operation based on a control command output by the control system.
  • the processor in the cleaning drive control assembly is shared with the processor in the control system.
  • FIG. 4 is a flow chart showing a control method of the cleaning robot of the present application in an embodiment.
  • the control method is primarily performed by a control system.
  • the control system can be configured in a cleaning robot, as shown in Figure 1 and its description, or can be another control system capable of performing the control method.
  • the control method includes step S110, step S120, and step S130.
  • step S110, step S120, and step S130 can be implemented by the processing device.
  • the processing device can include one or more processors.
  • the processing device is operatively coupled to the volatile memory and/or the non-volatile memory in the storage device.
  • the processing device may execute instructions stored in the memory and/or the non-volatile storage device to perform operations in the robot, such as analyzing the captured image and controlling the behavior of the cleaning robot based on the analysis result, and the like.
  • the processor may include one or more general-purpose microprocessors, one or more application-specific integrated circuits (ASICs), one or more digital signal processors (DSPs), one or more field-programmable gate arrays (FPGAs), or any combination thereof.
  • the processing device is also operatively coupled to I/O ports that enable the robot to interact with various other electronic devices, and to input structures that enable the user to interact with the robot.
  • the input structure can include buttons, keyboards, mice, trackpads, and the like.
  • the other electronic devices may be, for example, a motor in the robot's mobile system, or a slave processor in the robot dedicated to controlling the mobile system, such as a microcontroller unit (MCU).
  • the processing device is coupled to the storage device and the camera device, the mobile system, and the cleaning system, respectively, via data lines.
  • the processing device interacts with the storage device through a data reading and writing technology, and the processing device interacts with the camera device, the mobile system, and the cleaning system through an interface protocol.
  • the data reading and writing technology includes but is not limited to: a high speed/low speed data interface protocol, a database read and write operation, and the like.
  • the interface protocols include, but are not limited to, an HDMI interface protocol, a serial interface protocol, and the like.
  • in step S110, the camera device is controlled to capture images in real time in the cleaning robot's navigation operating environment.
  • the processing device controls the camera to capture an image in real time in a cleaning robot navigation operation environment.
  • the camera device may be a camera for taking still images or videos.
  • the cleaning robot may preset a time interval for capturing images according to the navigation operating environment, and the processing device then controls the camera device to capture an image at each preset interval, acquiring still images at different times.
  • the processing device controls the camera to capture a video.
  • the navigation operation environment refers to an environment in which the robot moves and performs corresponding operations according to the current positioning and the navigation route determined based on the current positioning.
  • the navigation operating environment of the cleaning robot refers to an environment in which the cleaning robot moves according to the navigation route and performs a cleaning operation.
  • in step S120, at least one captured image is analyzed.
  • the processing device analyzes at least one of the captured images.
  • the processing device may analyze at least one of the acquired still images.
  • when a video is captured, the processing device may first extract image frames from the video, continuously or at intervals, and then select individual frames as images for analysis.
  • the processing device can analyze one or more images.
  • in step S130, the behavior of the cleaning robot is controlled according to the result of the analysis and a preset cleaning mode, wherein the cleaning mode includes a cleaning mode corresponding to underexposed areas.
  • the processing device controls the behavior of the cleaning robot based on the result of the analysis and in accordance with a preset cleaning mode.
  • the result of the analysis indicates either that the cleaning robot is located in an underexposed area or that it is not.
  • an underexposed area is a dark area formed when an opaque object blocks a light source (the sun, a lamp, etc.) so that light cannot pass through, such as the area under a bed or under a sofa.
  • an underexposed area may also refer to the situation in which the cleaning robot is located somewhere the light intensity is so weak that too little light enters the camera device and the brightness of the captured image falls below a preset brightness threshold; that is, the image is underexposed.
  • the brightness of the captured image may be described by image grayscale values. For example, when the processing device detects that the image contains an area whose grayscale values are smaller than a preset grayscale threshold, it determines the image to be underexposed and thus determines that the cleaning robot is located in an underexposed area.
  • the brightness may also be described by a light intensity value provided by a light sensor in the camera device, for example, the processing device acquires an image and corresponding light intensity data when the light intensity data is less than a pre-light When the light intensity threshold is set, the processing device determines that the cleaning robot is located in the underexposed area.
  • the processing device determines whether the cleaning robot is located in the underexposed region based on the grayscale values and light intensity data in the image. For example, the processing device determines that the cleaning robot is located in the underexposed region by simultaneously satisfying two of the above two examples.
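  • The combined determination described above can be illustrated with a short sketch. The following Python fragment is a minimal, non-authoritative example; the grayscale threshold, the dark-pixel ratio, and the light-sensor threshold and units are all assumptions, since the application only requires that an image region below a grayscale threshold and a light reading below a light intensity threshold hold at the same time.

```python
import numpy as np

# Illustrative values only; the application leaves the actual thresholds
# to technical experience or experimental design.
GRAY_THRESHOLD = 40      # pixels darker than this count as underexposed
DARK_RATIO = 0.6         # fraction of dark pixels that marks the image underexposed
LIGHT_THRESHOLD = 50.0   # assumed light-sensor threshold (arbitrary units)

def image_underexposed(gray_image: np.ndarray) -> bool:
    """First condition: a region of grayscale values below the preset
    grayscale threshold dominates the captured frame."""
    return float(np.mean(gray_image < GRAY_THRESHOLD)) > DARK_RATIO

def robot_in_underexposed_area(gray_image: np.ndarray, light_intensity: float) -> bool:
    """Combined condition: the image looks underexposed AND the light
    intensity reported by the camera's light sensor is below its threshold."""
    return image_underexposed(gray_image) and light_intensity < LIGHT_THRESHOLD
```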
  • the preset cleaning mode may be preset and stored by a storage device in the cleaning robot.
  • the cleaning mode can also be obtained by defining a cleaning area of the robot in advance.
  • The user can set cleaning areas according to different classification methods. For example, according to the underexposure condition, the processing device can set the cleaning areas to include underexposed areas and non-underexposed areas.
  • As another example, according to the room layout, the user can set cleaning areas of different categories for the cleaning robot, such as a bedroom area, a living room area, and a kitchen area; areas of different categories may overlap. For instance, the areas under the bed and under the sofa in the living room area may belong to the underexposed area, while other areas belong to the non-underexposed area.
  • the user can input a cleaning mode corresponding to each cleaning area to the cleaning robot.
  • the cleaning mode includes a cleaning mode corresponding to the underexposed area.
  • For example, the cleaning mode corresponding to the underexposed area may be set to stop cleaning when the cleaning robot is located in the underexposed area, to lag cleaning when the cleaning robot is located in the underexposed area, to focused cleaning when the cleaning robot is located in the underexposed area, or to continued cleaning when the cleaning robot is located in the underexposed area.
  • The cleaning means may include sweeping, vacuuming, mopping, or any combination thereof.
  • Lag cleaning may mean, for example, that when the cleaning robot is located in the underexposed area it does not clean that area for the moment and instead continues to clean other areas, returning to clean the underexposed area after the other areas are finished.
  • Focused cleaning may enhance the cleaning effect by at least one means of adjusting the cleaning mode of the cleaning assembly, such as increasing the rotational speed of the roller brush assembly, spraying liquid through a liquid applicator, increasing the pressure of the scrubbing assembly, or increasing the suction force of the vacuum air duct.
  • The cleaning modes also include other modes, for example, continued cleaning when the cleaning robot is not located in the underexposed area.
  • the processing device in the cleaning robot may determine a cleaning mode corresponding to the underexposed region based on user input.
  • the cleaning robot can also comprise a human-machine interaction device, which is also connected to the processing device. The user can directly input the cleaning mode corresponding to each underexposed area on the human-machine interaction device provided by the cleaning robot.
  • Alternatively, the cleaning robot includes a network device connected to the processing device; the user's other intelligent terminals (such as a mobile phone, a tablet computer, or a personal computer) can exchange data with the processing device through the network device, so that the cleaning mode entered on the terminal for each underexposed area is transmitted to the processing device, which stores the corresponding mapping in the storage device of the cleaning robot.
  • The underexposed areas may be pre-calibrated, obtained by the cleaning robot through image analysis, or determined by combining pre-calibration with image analysis.
  • Each underexposed area may correspond to a uniform cleaning mode, to its own separate cleaning mode, or to a cleaning mode assigned according to the classification of the underexposed area.
  • The behavior of the cleaning robot may include, when the cleaning robot is located in the underexposed area: moving along the original navigation route and continuing to clean, moving along the original navigation route and stopping cleaning, moving along the original navigation route with focused cleaning, moving along the original navigation route with lag cleaning, or modifying the navigation route and continuing to clean.
  • The behavior of the cleaning robot may also include continuing to clean along the original navigation route when the cleaning robot is not located in the underexposed area, and the like.
  • It should be noted that the above cleaning modes and behaviors of the cleaning robot are merely examples (a minimal dispatch sketch follows below), and do not limit the cleaning modes and robot behaviors of the present application. In fact, technicians can set other cleaning modes and robot behaviors according to the type of cleaning robot, user requirements, and the like; these are not described one by one here.
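  • As an illustration of how the analysis result and the preset cleaning mode can jointly select a behavior, the following Python sketch maps the modes listed above onto robot actions. The CleaningMode enumeration and the robot interface (follow_route, defer_area) are hypothetical names introduced for this example, not part of the present application.

```python
from enum import Enum, auto

class CleaningMode(Enum):
    STOP = auto()      # do not clean the underexposed area
    LAG = auto()       # clean it later, after the other areas are finished
    FOCUSED = auto()   # clean it with enhanced settings
    CONTINUE = auto()  # clean it like any other area

def act_on_analysis(in_underexposed: bool, mode: CleaningMode, robot) -> None:
    """Dispatch the robot behavior from the analysis result and the preset mode."""
    if not in_underexposed:
        robot.follow_route(clean=True)       # continue cleaning on the original route
    elif mode is CleaningMode.STOP:
        robot.follow_route(clean=False)      # traverse the area without cleaning
    elif mode is CleaningMode.LAG:
        robot.defer_area()                   # remember the area and clean it last
        robot.follow_route(clean=False)
    elif mode is CleaningMode.FOCUSED:
        robot.follow_route(clean=True, boost=True)  # e.g., faster roller brush, stronger suction
    else:                                    # CleaningMode.CONTINUE
        robot.follow_route(clean=True)
```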
  • By analyzing the captured images and controlling the behavior of the cleaning robot based on the result of the analysis and according to the preset cleaning mode, the control method of the cleaning robot of the present application allows certain specific areas to be cleaned according to user demand, achieving the purpose of cleaning by area according to user needs.
  • the step of analyzing the captured at least one image in step S120 comprises determining the current position of the cleaning robot using the positioning features identified in the at least one image.
  • the positioning features include, but are not limited to, shape features, grayscale features, and the like.
  • the shape features include, but are not limited to, corner features, line features, edge features, curved features, and the like.
  • The grayscale features include, but are not limited to, a grayscale transition feature, a grayscale value above or below a grayscale threshold, the size of a region in an image frame that contains a predetermined grayscale range, and the like.
  • the number of positioning features that can be identified in the image is usually plural, for example, more than ten.
  • In one implementation of determining the current position of the cleaning robot using the positioning features identified in at least one image, for example, the graphic of a physical object in the captured image is identified and matched against the graphic of a standard part, and the positioning information of the robot in the current physical space is determined based on the standard physical features of the standard part.
  • As another example, the positioning information of the robot in the current physical space is determined by matching the features identified in the image with the features in the landmark information of the preset map data.
  • In another implementation, the current position of the cleaning robot is determined using the positioning features identified in at least two images, for example, by using the positional offset information of matched feature points in the two image frames to determine the position and posture of the robot.
  • In yet another implementation, the cleaning robot is positioned using the positioning features identified in at least one image together with pre-established landmark information.
  • The landmark information may be attribute information collected during previous navigation sessions and corresponding to positioning points in the map data, including but not limited to: the positioning features that the camera device can capture at a certain positioning point of the map data, the map data in physical space when the positioning feature was photographed, the positions in the corresponding image frames where the positioning feature was previously photographed, and the position and posture of the cleaning robot when the corresponding positioning feature was photographed.
  • the landmark information may be stored with the map data in the storage device.
  • the map data may be pre-built based on SLAM (Simultaneous Localization and Mapping) or VSLAM technology.
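  • A much-simplified sketch of positioning against stored landmark information is given below: descriptors extracted from the current frame are matched to the stored landmark descriptors by nearest-neighbor distance, and the pose recorded for the best-supported landmark is returned. Real VSLAM matching is considerably more involved; the array shapes and the distance threshold here are assumptions for illustration.

```python
import numpy as np

def locate_by_landmarks(frame_desc: np.ndarray,      # (F, D) descriptors from current frame
                        landmark_desc: np.ndarray,   # (L, D) descriptors stored as landmarks
                        landmark_poses: np.ndarray,  # (L, 3) stored (x, y, heading) per landmark
                        max_distance: float = 0.5):
    """Return the stored pose of the landmark supported by the most feature
    matches, or None when no descriptor pair is close enough."""
    # pairwise Euclidean distances between frame and landmark descriptors
    d = np.linalg.norm(frame_desc[:, None, :] - landmark_desc[None, :, :], axis=2)
    best = d.min(axis=1)
    if best.min() > max_distance:
        return None  # no reliable match in this frame
    # vote: each sufficiently close frame feature votes for its nearest landmark
    votes = np.bincount(d.argmin(axis=1)[best <= max_distance],
                        minlength=len(landmark_poses))
    return landmark_poses[votes.argmax()]
```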
  • In addition, when performing the positioning analysis using at least one captured image in step S120, the processing device also performs, without any necessary timing constraint, the step of analyzing whether the cleaning robot is located in the underexposed area using the grayscale features in at least one image.
  • In practical applications, when the cleaning robot is located in the underexposed area, it is usually under a bed, under a sofa, or the like. In that case, the user may not want the area to be cleaned because of excessive dust under the bed or sofa, or the user may want the area to be cleaned only after the other areas are finished.
  • the processing device performs underexposure region analysis by acquiring an image taken by the imaging device.
  • The image captured by the imaging device is usually in RGB color mode, so the processing device first performs grayscale processing on the captured image to obtain a grayscale image, and then performs underexposure analysis on the grayscale image to determine whether the cleaning robot is located in an underexposed area.
  • Here, the processing device may perform the grayscale processing on the captured image using a component method, a maximum value method, an average value method, or a weighted average method to obtain the grayscale image.
  • A grayscale image is a monochrome image with 256 grayscale levels or gradations from black to white, where 255 represents white and 0 represents black.
  • In one embodiment, the processing device determines whether the cleaning robot is located in the underexposed area by analyzing the grayscale distribution, the grayscale mean, the grayscale maxima and minima, and the like of the grayscale image.
  • In one example in which the grayscale distribution is selected to characterize the grayscale features, when the processing device analyzes the grayscale distribution of the grayscale image and finds that it is concentrated in a preset grayscale underexposure interval, the processing device determines that the cleaning robot is located in the underexposed area.
  • The grayscale underexposure interval can be determined from technical experience or experimental design and stored in the cleaning robot in advance.
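  • The grayscale-distribution test can be sketched as follows. The weighted-average conversion is one of the methods named above; the underexposure interval and the concentration ratio are assumed values standing in for the experimentally determined ones.

```python
import numpy as np

UNDEREXPOSURE_INTERVAL = (0, 50)  # assumed preset grayscale underexposure interval
CONCENTRATION_RATIO = 0.8         # assumed fraction that counts as "concentrated"

def rgb_to_gray(rgb: np.ndarray) -> np.ndarray:
    """Weighted-average grayscale conversion of an (H, W, 3) RGB image."""
    return (0.299 * rgb[..., 0] + 0.587 * rgb[..., 1] + 0.114 * rgb[..., 2]).astype(np.uint8)

def distribution_in_underexposure_interval(gray: np.ndarray) -> bool:
    """True when the grayscale histogram is concentrated in the preset interval."""
    hist, _ = np.histogram(gray, bins=256, range=(0, 256))
    lo, hi = UNDEREXPOSURE_INTERVAL
    return hist[lo:hi + 1].sum() / hist.sum() >= CONCENTRATION_RATIO
```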
  • In a specific example, the camera device is selected to be an imaging device that includes an adjustable aperture.
  • The processing device can change the intensity of the light acquired by the imaging device by increasing the amount of incoming light; after compensating the amount of incoming light, the processing device analyzes the images captured by the adjusted imaging device.
  • The aperture is usually disposed in the imaging device to adjust the amount of light entering it. For example, when the imaging device is located in the underexposed area, the amount of incoming light can be increased automatically by enlarging the aperture; the processing device then performs the above underexposure analysis on at least one image captured by the adjusted imaging device, improving the accuracy of the analysis result.
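  • The compensation loop can be pictured as below. The camera interface (open_aperture, capture) is a hypothetical stand-in for the real driver; the sketch only shows the order of operations: widen the aperture, re-capture, re-run the underexposure analysis.

```python
def compensate_and_reanalyze(camera, is_underexposed, max_steps: int = 3) -> bool:
    """Open the aperture step by step and re-run the underexposure analysis
    on each newly captured frame; return True if the scene still looks
    underexposed after compensation."""
    for _ in range(max_steps):
        camera.open_aperture()        # increase the amount of incoming light
        frame = camera.capture()
        if not is_underexposed(frame):
            return False              # compensation recovered a normal exposure
    return True                       # still underexposed: likely a genuinely dark area
```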
  • Although the above manner of analyzing the captured images allows a prompt response with the cleaning mode corresponding to the underexposed area, in some practical applications the imaging device automatically adjusts its aperture under different light intensities, so when the light intensity changes the grayscale of the captured image may be generally low because the aperture is not adjusted in time.
  • To prevent the processing device from analyzing a single image inaccurately, underexposure analysis can be performed on multiple images to improve its accuracy.
  • In addition, in other practical applications, the imaging device may be briefly blocked by an object; to prevent such occlusion from interfering with the analysis result, the processing device may analyze multiple images captured within a preset duration to determine whether the cleaning robot is located in the underexposed area.
  • The preset duration can be determined from technical experience or experimental design and stored in the cleaning robot in advance; generally, it can be on the order of milliseconds. For example, when the processing device detects that one image is underexposed, it continues to check that at least one subsequent image captured by the imaging device also exhibits underexposure characteristics, and only then determines that the robot is located in the underexposed area.
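  • The multi-image check can be reduced to a small debouncer, sketched below under the assumption that frames arrive within the preset millisecond-scale window; the required frame count is an assumed value.

```python
from collections import deque

class UnderexposureDebouncer:
    """Report an underexposed area only when several consecutive frames are
    all underexposed, filtering out brief occlusions and untimely aperture
    adjustment."""
    def __init__(self, required_frames: int = 3):
        self.history = deque(maxlen=required_frames)

    def update(self, frame_underexposed: bool) -> bool:
        self.history.append(frame_underexposed)
        return len(self.history) == self.history.maxlen and all(self.history)
```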
  • In another embodiment, the processing device may also determine whether the cleaning robot is located in an underexposed area by analyzing the photosensitive information from a photosensitive element.
  • the photosensitive element includes, but is not limited to, a photoresistor, a phototransistor, a photomultiplier tube, a CCD element, a CMOS device, and the like. Furthermore, the photosensitive element may be provided on the body of the cleaning robot.
  • For example, the photosensitive element may be disposed at the front edge, in the traveling direction, of the top cover of the cleaning robot, or a plurality of photosensitive elements may be spaced apart along the edge of the top cover, so that as soon as a part of the cleaning robot enters the underexposed area, it can be determined that the cleaning robot is located in the underexposed area.
  • In some embodiments, the photosensitive element first converts the sensed light intensity into light intensity data and outputs it to the processing device, and the processing device then analyzes the light intensity data to determine whether the cleaning robot is located in the underexposed area.
  • For example, the processing device presets an underexposure threshold; when the obtained light intensity data is lower than the underexposure threshold, the processing device determines that the cleaning robot is located in the underexposed area, and otherwise determines that it is not.
  • The underexposure threshold can be determined from technical experience or experimental design and stored in the cleaning robot in advance.
  • It should be noted that determining whether the cleaning robot is located in the underexposed area based on one or more of the above analysis methods is merely an example, and does not limit the manner in which the present application determines that the robot is in an underexposed area.
  • In fact, technicians can also combine various grayscale analysis methods with the results obtained from the photosensitive element to further evaluate, and finally determine, whether the cleaning robot is located in the underexposed area; these variations will not be described one by one here.
  • However, any analysis based on the image grayscale values mentioned in the present application, or any improvement made on that basis to determine that the cleaning robot is located in the underexposed area, should be regarded as a specific example of the present application.
  • Here, when the processing device controls the mobile system to move into the underexposed area according to the pre-planned route, the corresponding cleaning mode can be employed for the moving and cleaning operations.
  • the manner of controlling the behavior of the cleaning robot based on the result of the analysis and according to the preset cleaning mode in step S130 includes any one of the following:
  • 1) Adjusting the navigation route of the cleaning robot so that it leaves the underexposed area. Please refer to FIG. 2, which is a schematic diagram of the cleaning robot of the present application handling an underexposed area in one embodiment.
  • As shown in the figure, in this example the underexposed area is the area under a bed, and the preset cleaning mode corresponding to the underexposed area is no cleaning.
  • While the cleaning robot is working, it moves along the original navigation route in the direction indicated by arrow A in the figure and performs the cleaning operation.
  • When the processing device of the cleaning robot determines, by any of the above methods, that the cleaning robot is located in the underexposed area, i.e., under the bed, the processing device modifies the navigation route and controls the mobile system to leave the underexposed area.
  • In this example, the cleaning robot turns 180° as indicated by turning arrow B in the figure and then continues to move, so that it leaves the area under the bed, continues to move in the direction indicated by arrow C, and cleans other non-underexposed areas.
  • It should be noted that the 180° turn shown by arrow B in FIG. 2 is merely an example; the deflection angle is not limited to 180° and can be set flexibly according to actual needs, without exhaustive examples here.
  • 2) Controlling the cleaning robot to pass through the underexposed area along the original navigation route, and controlling the cleaning robot to clean the traversed underexposed area according to the preset cleaning mode corresponding to the underexposed area. Please refer to FIG. 3, which is a schematic diagram of the cleaning robot of the present application handling an underexposed area in another embodiment.
  • As shown in the figure, in one example the underexposed area is the area under a bed, and the preset cleaning mode corresponding to the underexposed area is no cleaning.
  • While the cleaning robot is working, it moves along the original navigation route in the direction indicated by arrow A in the figure and performs the cleaning operation.
  • When the processing device of the cleaning robot determines, by any of the above methods, that the cleaning robot is located in the underexposed area, i.e., under the bed, it controls the mobile system to keep moving along the original route in the direction indicated by arrow B in the figure, while also controlling the cleaning system not to clean the underexposed area in which it is currently located.
  • When the cleaning robot is detected to be in a non-underexposed area, the processing device continues to control the mobile system to move along the original route in the direction indicated by arrow C in the figure and controls the cleaning system to perform the cleaning operation according to the cleaning mode corresponding to the non-underexposed cleaning area.
  • As another example, the preset cleaning mode corresponding to the underexposed area is focused cleaning.
  • While the cleaning robot is working, it moves along the original navigation route in the direction indicated by arrow A in the figure and performs the cleaning operation.
  • When the processing device of the cleaning robot determines, by any of the above methods, that the cleaning robot is located in the underexposed area, i.e., under the bed, it controls the mobile system to keep moving along the original route in the direction indicated by arrow B in the figure, while also controlling the cleaning system to perform the focused cleaning operation on the underexposed area in which it is located.
  • When the cleaning robot is detected to be in a non-underexposed area, the processing device continues to control the mobile system to move along the original route in the direction indicated by arrow C in the figure and controls the cleaning system to perform a regular cleaning operation according to the cleaning mode corresponding to the non-underexposed cleaning area.
  • In a space that needs cleaning but is not illuminated by a light source, such as a room with the lights off, it is inaccurate for the processing device to control the cleaning system and the mobile system with a no-cleaning mode based only on the grayscale features or the photosensitive information of the photosensitive element.
  • To adapt to various practical application scenarios, map data and landmark information are stored in the storage device, where the landmark information includes not only the foregoing attribute information but also the object information represented by the positioning features, the attributes of the corresponding underexposed areas, and the like.
  • The control method therefore further includes the step of constructing, in the navigation operating environment, landmark information of underexposed areas in the pre-built map data.
  • When the processing device determines that the cleaning robot moves from a non-underexposed area to an underexposed area, or from an underexposed area to a non-underexposed area, it adds attribute information to the pre-built map data. For example, the processing device determines the boundary position between the underexposed area and the non-underexposed area according to real-time positioning; then the processing device adds an attribute marking the underexposed area at the boundary position in the map data, or adds the underexposed-area attribute to landmark information that already exists near the boundary position.
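  • A minimal sketch of recording this attribute is given below; the dictionary keyed by map position is a simplified stand-in for the stored map and landmark structure.

```python
def add_underexposure_attribute(landmarks: dict, boundary_pos: tuple) -> None:
    """At the boundary position determined from real-time positioning, add the
    underexposed-area attribute, reusing an existing nearby landmark entry
    when one is already recorded at that position."""
    entry = landmarks.setdefault(boundary_pos, {"features": []})
    entry["underexposed_area"] = True
```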
  • In practical applications, the indoor furnishings and the incoming light may change. For example, a position previously recorded in the landmark information as non-underexposed because no sofa stood there may be analyzed as underexposed once a sofa is moved to that position; likewise, a room into which sunlight shines is not an underexposed area during the day, but becomes one at night when no light is turned on.
  • Therefore, the control method further includes the step of updating the landmark information of the corresponding underexposed areas in the pre-built map data.
  • For example, when the processing device, after performing underexposure analysis via the images or the photosensitive device, determines that the cleaning robot is in an underexposed area while the corresponding landmark information carries a non-underexposed-area attribute, it updates the corresponding landmark information; conversely, when it determines that the cleaning robot is in a non-underexposed area while the corresponding landmark information carries an underexposed-area attribute, it likewise updates the corresponding landmark information.
  • On the basis of the above landmark information, the step of controlling the behavior of the cleaning robot according to the preset cleaning mode includes: navigating the cleaning robot based on the preset cleaning mode, the positioning information identified in at least one image, and the landmark information.
  • the landmark information includes an attribute of an underexposed area or an attribute of a non-underexposed area.
  • In a specific example, according to a preset cleaning mode in which underexposed areas are not cleaned and non-underexposed areas are cleaned, together with the landmark information in the map data carrying underexposed-area attributes, the processing device constructs a cleaning route that contains no underexposed areas; it determines the current position of the cleaning robot from at least one image taken along the cleaning route and controls the mobile system to move and clean along the cleaning route, thereby leaving the underexposed areas uncleaned.
  • In another specific example, according to a preset cleaning mode in which underexposed areas receive lagged focused cleaning and non-underexposed areas are cleaned, together with the landmark information in the map data carrying underexposed-area attributes, the processing device constructs a first route segment that first cleans the non-underexposed areas, a second route segment that moves from the tail end of the non-underexposed-area route to the underexposed area, and a third route segment that cleans the underexposed area.
  • The processing device determines the current position of the cleaning robot from at least one image taken along the first segment and controls the mobile system to move and clean along the first segment; next, it positions the robot from the captured images and controls the mobile system to move along the second segment from the tail end of the first segment to the start of the third segment, controlling the cleaning system not to perform cleaning during the movement along the second segment; then, according to the preset focused cleaning mode, it controls the cleaning system to perform the corresponding cleaning operations while moving along the third segment.
  • While moving along the third route segment, the processing device may locate the current position of the cleaning robot and navigate using only the movement data provided by the movement sensors in the cleaning robot.
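  • The three-segment plan can be sketched as follows. Coverage is reduced to a list of cells and the transit leg is collapsed to a single entry waypoint; a real planner would compute an actual path between the segments. The cell representation and flags are assumptions for illustration.

```python
def build_lagged_focused_route(cells, is_underexposed):
    """Return (cell, clean, focused) tuples: clean the non-underexposed cells
    first, transit without cleaning, then clean the underexposed cells in the
    focused mode."""
    normal = [c for c in cells if not is_underexposed(c)]
    dark = [c for c in cells if is_underexposed(c)]
    segment1 = [(c, True, False) for c in normal]          # regular cleaning
    segment2 = [(dark[0], False, False)] if dark else []   # transit leg, no cleaning
    segment3 = [(c, True, True) for c in dark]             # focused cleaning
    return segment1 + segment2 + segment3
```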
  • When the analysis determines that the cleaning robot is located at a room entrance and the corresponding room lies in an underexposed area, the processing device controls the behavior of the cleaning robot according to the cleaning mode of the corresponding room. For example, when the processing device, moving according to the landmark information, reaches the entrance of an unlit room, the image analysis indicates an underexposed area; the processing device can identify positioning features from the cached images taken before entering the room and obtain a door attribute from the landmark information containing the identified positioning features, thereby determining that the underexposed area is a room requiring cleaning. Therefore, the processing device calculates the moved distance and the current position using the movement data provided by the movement sensors in the cleaning robot, controls the mobile system of the cleaning robot to move within the room according to the original navigation route, and also controls the cleaning system of the cleaning robot to clean the floor and the wall edges of the room.
  • In addition, in some embodiments, the control method further includes the step of outputting prompt information after the robot enters the underexposed area.
  • the processing device may alert the user that the cleaning robot has entered the underexposed area by issuing an audible alarm or by sending a message to the user's mobile terminal after entering the underexposed area.
  • FIG. 5 is a schematic structural view of the cleaning robot of the present application in an embodiment.
  • the cleaning robot includes a control system 21, an imaging device 22, a movement system 23, and a cleaning system 24.
  • Camera device 22, mobile system 23, and cleaning system 24 are all coupled to control system 21.
  • The control system 21 is configured to output control instructions corresponding to a preset cleaning mode based on the analysis result of the captured images; the imaging device 22 is configured to capture images for processing by the control system; the mobile system 23 is configured to drive the cleaning robot to move based on the control instructions; and the cleaning system 24 is configured to perform cleaning operations based on the control instructions.
  • The control instructions include, but are not limited to, a moving direction, a moving speed, and a moving distance determined based on the navigation route and the current position, the cleaning operation performed according to the preset cleaning mode, and the like.
  • the camera device 22 includes, but is not limited to, a camera, a video camera, a camera module integrated with an optical system or a CCD chip, a camera module integrated with an optical system and a CMOS chip, and the like.
  • the power supply system of the camera device can be controlled by the power supply system of the cleaning robot, and the camera device starts capturing images during the power-on movement of the robot.
  • the camera device may be provided on the body of the cleaning robot.
  • For example, the camera device may be disposed in the middle or at the edge of the top cover of the cleaning robot, or the camera device may be disposed below the plane of the top surface of the cleaning robot, in a recessed structure near the geometric center of the body or near the edge of the body.
  • In some embodiments, the camera device may be located on the top surface of the cleaning robot with the optical axis of its field of view within ±30° of the vertical.
  • For example, the camera device is located in the middle or at the edge of the top surface of the cleaning robot, and the angle of its optical axis with respect to the vertical is -30°, -29°, -28°, -27°, ..., -1°, 0°, 1°, 2°, ..., 29°, or 30°.
  • It should be noted that the above angles between the optical axis and the vertical are merely examples and do not limit the angle precision to 1°; according to the design requirements of the actual robot, the precision can be higher, such as 0.1°, 0.01°, or finer, without exhaustive examples here.
  • In some embodiments, the camera device includes an adjustable aperture controlled by the control system, so that the control system can acquire images with an increased amount of incoming light.
  • the processing device can change the intensity of the light acquired by the camera by increasing the amount of incoming light.
  • the processing device analyzes the image captured by the adjusted imaging device.
  • the aperture is usually disposed in the imaging device for adjusting the amount of light entering the imaging device. For example, when the imaging device is located in the underexposed area, the amount of light entering can be automatically increased by increasing the aperture. Then, the processing device performs underexposure analysis on the image taken by the adjusted imaging device to improve the accuracy of the analysis result.
  • The mobile system 23, which is located at the bottom of the cleaning robot, moves under the control of the control system 21.
  • the mobile system 23 includes a drive control device and at least two roller sets.
  • At least one of the at least two roller sets is a controlled roller set.
  • The drive control device is coupled to the control system and drives the controlled roller set to roll based on the control instructions output by the control system.
  • the drive control device includes a drive motor.
  • The drive motor is coupled to the roller sets for directly driving them to roll.
  • the drive control device may also include one or more processors (such as a CPU or a Micro Processing Unit (MCU)) dedicated to controlling the drive motor.
  • the micro processing unit is configured to convert a control command output by the control system into an electric signal for controlling a driving motor, and control a rotation speed, a steering, and the like of the driving motor according to the electric signal to drive the cleaning robot to move.
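  • The kind of translation performed by the microprocessing unit can be illustrated with standard differential-drive kinematics; the function below converts a linear and angular velocity command into left and right wheel speeds, leaving PWM generation to the motor driver. The parameter names are illustrative.

```python
def command_to_wheel_speeds(linear: float, angular: float, wheel_base: float):
    """Convert a movement command (linear velocity, angular velocity) into
    left/right wheel speeds for a two-wheel differential drive."""
    left = linear - angular * wheel_base / 2.0
    right = linear + angular * wheel_base / 2.0
    return left, right
```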
  • the processor in the drive control device may be shared with a processor in the control system or may be independently set.
  • the processor in the drive control device functions as a slave processing device, and the processor in the control system functions as a master device, and the drive control device performs motion control based on control of the control system.
  • the processor in the drive control device is shared with a processor in the control system.
  • the drive control device receives the control command output by the control system through the program interface.
  • The drive control device drives the controlled roller set to roll based on the control instructions output by the control system.
  • The cleaning system 24 performs cleaning operations under the control of the control system 21.
  • the cleaning system 24 includes a cleaning assembly and a cleaning drive control assembly.
  • the cleaning drive control assembly is coupled to the control system, the cleaning drive control assembly driving the cleaning assembly to clean the ground based on a control command output by the control system.
  • the cleaning assembly may include a roller brush assembly, a filter screen, a scrubbing assembly, a suction duct, a dust box (or a garbage box), a suction motor, and the like.
  • Either one or both of the roller brush assembly and the scrubbing assembly may be provided, depending on the actual design of the cleaning robot.
  • the roller brush assembly includes, but is not limited to, a side brush, a side brush driver, a roller, a roller driver, and the like.
  • The scrubbing assembly includes, but is not limited to, a water container, a wiping cloth, a mounting structure for the cloth, a driver for the mounting structure, and the like.
  • the cleaning drive control assembly can include one or more processors (such as a CPU or micro processing unit (MCU)) dedicated to controlling the cleaning assembly.
  • the processor in the cleaning drive control assembly can be shared with the processor in the control system or can be independently set.
  • the processor in the cleaning drive control assembly functions as a slave processing device, the processor in the control system as a master device, and the cleaning drive control component performs a cleaning operation based on a control command output by the control system.
  • the processor in the cleaning drive control assembly is shared with the processor in the control system.
  • The control system may be the control system shown in FIG. 1, in combination with the foregoing description of FIG. 1, and is not described in detail again here.
  • the storage device 211 shown in FIG. 5 may correspond to the storage device 11 described in FIG. 1;
  • the processing device 213 shown in FIG. 5 may correspond to the processing device 13 described in FIG. 1.
  • Taking as an example the control system 21 shown in FIG. 5, which includes a storage device 211 and a processing device 213 connected to the mobile system 23, the cleaning system 24, and the imaging device 22, the operation process by which the control system 21 of the cleaning robot outputs control instructions corresponding to the preset cleaning mode based on the analysis result of the captured images is described as follows:
  • the storage device 211 stores one or more programs.
  • The programs include the corresponding programs, described below, that are invoked by the processing device 213 to perform the steps of control, analysis, and the like.
  • the processing device 213 performs control processing by calling a program stored in the storage device 211.
  • the processing device controls the camera to capture an image in real time in a cleaning robot navigation operating environment.
  • the camera device may be a camera for taking still images or videos.
  • the camera device can also include an adjustable aperture.
  • the processing device can increase the amount of incoming light by adjusting the aperture to change the intensity of the light acquired by the imaging device.
  • the processing device controls the imaging device to capture an image, and in subsequent processing, the processing device analyzes the image captured by the adjusted imaging device.
  • the processing device then analyzes the captured at least one image.
  • the processing device may analyze at least one of the acquired still images.
  • the processing device may continuously or discontinuously acquire image frames in the acquired video, and then select one frame of image as one image for analysis.
  • the processing device can also analyze a plurality of images taken within a preset duration.
  • the preset duration can be known according to technical experience or experimental design and stored in the cleaning robot in advance. Generally, the preset duration can be on the order of milliseconds.
  • the processing device can determine the current position of the cleaning robot using the locating features identified in the at least one image.
  • the processing device can position the cleaning robot with positioning features identified by the at least one image and pre-established landmark information.
  • The landmark information may be attribute information collected during previous navigation sessions and corresponding to positioning points in the map data, including but not limited to: the positioning features that the camera device can capture at a certain positioning point of the map data, the map data in physical space when the positioning feature was photographed, the positions in the corresponding image frames where the positioning feature was previously photographed, and the position and posture of the cleaning robot when the corresponding positioning feature was photographed.
  • the landmark information may be stored with the map data in the storage device.
  • the map data may be pre-built based on SLAM (Simultaneous Localization and Mapping) or VSLAM technology.
  • the processing device analyzes whether the cleaning robot is in an underexposed region using grayscale features in at least one of the images. In another embodiment, the processing device may also determine whether the cleaning robot is located in the underexposed region by analyzing the photosensitive information from the photosensitive member.
  • the processing device controls the behavior of the cleaning robot based on the result of the analysis and according to a preset cleaning mode; wherein the cleaning mode includes a cleaning mode corresponding to the underexposed region.
  • the processing device in the cleaning robot can determine a cleaning mode for the corresponding underexposed region based on user input. Further, the manner in which the processing device controls the behavior of the cleaning robot based on the result of the analysis and in accordance with a preset cleaning mode includes any one of the following: 1) adjusting a navigation route of the cleaning robot to leave the underexposed region. Please refer to FIG. 2.
  • FIG. 2 is a schematic diagram of the cleaning robot of the present application handling an underexposed area in one embodiment. As shown in the figure, in this example the underexposed area is the area under a bed, and the preset cleaning mode corresponding to the underexposed area is no cleaning.
  • While the cleaning robot is working, it moves along the original navigation route in the direction indicated by arrow A in the figure and performs the cleaning operation.
  • When the processing device of the cleaning robot determines, by any of the above methods, that the cleaning robot is located in the underexposed area, i.e., under the bed, the processing device modifies the navigation route and controls the mobile system to leave the underexposed area.
  • In this example, the cleaning robot turns 180° as indicated by turning arrow B in the figure and then continues to move, so that it leaves the area under the bed, continues to move in the direction indicated by arrow C, and cleans other non-underexposed areas.
  • 2) Controlling the cleaning robot to pass through the underexposed area along the original navigation route, and controlling the cleaning robot to clean the traversed underexposed area according to the preset cleaning mode corresponding to the underexposed area. Please refer to FIG. 3, which is a schematic diagram of the cleaning robot of the present application handling an underexposed area in another embodiment.
  • As shown in the figure, in one example the underexposed area is the area under a bed, and the preset cleaning mode corresponding to the underexposed area is no cleaning.
  • While the cleaning robot is working, it moves along the original navigation route in the direction indicated by arrow A in the figure and performs the cleaning operation.
  • When the processing device of the cleaning robot determines, by any of the above methods, that the cleaning robot is located in the underexposed area, i.e., under the bed, it controls the mobile system to keep moving along the original route in the direction indicated by arrow B in the figure, while also controlling the cleaning system not to clean the underexposed area in which it is currently located.
  • When the cleaning robot is detected to be in a non-underexposed area, the processing device continues to control the mobile system to move along the original route in the direction indicated by arrow C in the figure and controls the cleaning system to perform the cleaning operation according to the cleaning mode corresponding to the non-underexposed cleaning area.
  • As another example, the preset cleaning mode corresponding to the underexposed area is focused cleaning.
  • While the cleaning robot is working, it moves along the original navigation route in the direction indicated by arrow A in the figure and performs the cleaning operation.
  • When the processing device of the cleaning robot determines, by any of the above methods, that the cleaning robot is located in the underexposed area, i.e., under the bed, it controls the mobile system to keep moving along the original route in the direction indicated by arrow B in the figure, while also controlling the cleaning system to perform the focused cleaning operation on the underexposed area in which it is located.
  • When the cleaning robot is detected to be in a non-underexposed area, the processing device continues to control the mobile system to move along the original route in the direction indicated by arrow C in the figure and controls the cleaning system to perform a regular cleaning operation according to the cleaning mode corresponding to the non-underexposed cleaning area.
  • the processing device may also alert the user that the cleaning robot has entered the underexposed area by issuing an audible alarm or by sending a message to the user's mobile terminal after entering the underexposed area.
  • In a space that needs cleaning but is not illuminated, such as a room with the lights off, it is inaccurate for the processing device to control the cleaning system and the mobile system with a no-cleaning mode based only on the grayscale features or the photosensitive information of the photosensitive element.
  • To adapt to various practical application scenarios, the storage device stores map data and landmark information, where the landmark information includes not only the foregoing attribute information but also the object information represented by the positioning features, the attributes of corresponding underexposed areas, and the like.
  • the processing device also performs the step of constructing landmark information of the underexposed area in the pre-built map data in the navigation operating environment. When the processing device determines that the cleaning robot moves from the non-underexposed region to the underexposed region, or moves from the underexposed region to the non-underexposed region, the attribute information is added to the pre-built map data.
  • The processing device further performs the step of updating the landmark information of the corresponding underexposed areas in the pre-built map data. For example, when the processing device, after performing underexposure analysis via the images or the photosensitive device, determines that the cleaning robot is in an underexposed area while the corresponding landmark information carries a non-underexposed-area attribute, it updates the corresponding landmark information; conversely, when it determines that the cleaning robot is in a non-underexposed area while the corresponding landmark information carries an underexposed-area attribute, it likewise updates the corresponding landmark information.
  • the processing device navigates the cleaning robot based on a preset cleaning mode, positioning information identified in at least one image, and the landmark information.
  • the landmark information includes an attribute of an underexposed area or an attribute of a non-underexposed area.
  • In a specific example, according to a preset cleaning mode in which underexposed areas are not cleaned and non-underexposed areas are cleaned, together with the landmark information in the map data carrying underexposed-area attributes, the processing device constructs a cleaning route that contains no underexposed areas; it determines the current position of the cleaning robot from at least one image taken along the cleaning route and controls the mobile system to move and clean along the cleaning route, thereby leaving the underexposed areas uncleaned.
  • In another specific example, according to a preset cleaning mode in which underexposed areas receive lagged focused cleaning and non-underexposed areas are cleaned, together with the landmark information in the map data carrying underexposed-area attributes, the processing device constructs a first route segment that first cleans the non-underexposed areas, a second route segment that moves from the tail end of the non-underexposed-area route to the underexposed area, and a third route segment that cleans the underexposed area.
  • The processing device determines the current position of the cleaning robot from at least one image taken along the first segment and controls the mobile system to move and clean along the first segment; next, it positions the robot from the captured images and controls the mobile system to move along the second segment from the tail end of the first segment to the start of the third segment, controlling the cleaning system not to perform cleaning during the movement along the second segment; then, according to the preset focused cleaning mode, it controls the cleaning system to perform the corresponding cleaning operations while moving along the third segment.
  • While moving along the third route segment, the processing device may locate the current position of the cleaning robot and navigate using only the movement data provided by the movement sensors in the cleaning robot.
  • When the analysis determines that the cleaning robot is located at a room entrance and the corresponding room lies in an underexposed area, the processing device controls the behavior of the cleaning robot according to the cleaning mode of the corresponding room. For example, when the processing device, moving according to the landmark information, reaches the entrance of an unlit room, the image analysis indicates an underexposed area; the processing device can identify positioning features from the cached images taken before entering the room and obtain a door attribute from the landmark information containing the identified positioning features, thereby determining that the underexposed area is a room requiring cleaning. Therefore, the processing device calculates the moved distance and the current position using the movement data provided by the movement sensors in the cleaning robot, controls the mobile system of the cleaning robot to move within the room according to the original navigation route, and also controls the cleaning system of the cleaning robot to clean the floor and the wall edges of the room.
  • The present application also provides a storage medium for a computer device; the storage medium stores at least one program which, when executed by a processor, performs any of the control methods described above.
  • The portions of the technical solution of the present application that in essence contribute over the prior art may be embodied in the form of a software product. The computer software product may include one or more machine-readable media on which machine-executable instructions are stored; when these instructions are executed by one or more machines such as a computer, a computer network, or another electronic device, the one or more machines perform operations in accordance with the embodiments of the present application, for example, performing each step of the control method of the robot.
  • The machine-readable medium may include, but is not limited to, a floppy disk, an optical disk, a CD-ROM (Compact Disk Read-Only Memory), a magneto-optical disk, a ROM (Read-Only Memory), a RAM (Random Access Memory), an EPROM (Erasable Programmable Read-Only Memory), an EEPROM (Electrically Erasable Programmable Read-Only Memory), a magnetic or optical card, flash memory, or other types of media/machine-readable media suitable for storing machine-executable instructions.
  • The storage medium may be located in the robot or in a third-party server, for example in a server that provides an application store; there is no restriction on the specific application store, such as the Huawei App Store or the Apple App Store.
  • This application can be used in a variety of general purpose or special purpose computing system environments or configurations.
  • The application can be described in the general context of computer-executable instructions executed by a computer, such as program modules.
  • program modules include routines, programs, objects, components, data structures, and the like that perform particular tasks or implement particular abstract data types.
  • the present application can also be practiced in distributed computing environments where tasks are performed by remote processing devices that are connected through a communication network.
  • program modules can be located in both local and remote computer storage media including storage devices.

Landscapes

  • Electric Vacuum Cleaner (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

A control method, a control system, and a cleaning robot to which they are applied. The control method includes the following steps: in the navigation operating environment of the cleaning robot, controlling a camera device (22) to capture images in real time; analyzing at least one captured image; and controlling the behavior of the cleaning robot based on the result of the analysis and according to a preset cleaning mode, where the cleaning mode includes a cleaning mode corresponding to an underexposed area. By analyzing the captured images and controlling the behavior of the cleaning robot based on the result of the analysis and according to the preset cleaning mode, certain specific areas can be cleaned according to user demand, achieving the purpose of cleaning by area according to user needs.

Description

控制方法、***及所适用的清洁机器人 技术领域
本申请涉及智能机器人领域,特别是涉及一种控制方法、***及所适用的清洁机器人。
背景技术
随着科技的不断发展,智能机器人逐渐走入人们的生活,其中,清洁机器人由于其无需人工参与即可自动清洁房屋而逐渐得到广泛应用。此外,随着机器人的移动技术的更新迭代,VSLAM(Visual Simultaneous Localization and Mapping,基于视觉信息的即时定位与地图构建,简称VSLAM)技术为机器人提供了更精准的导航能力,使得机器人能更有效地自主移动。
目前,清洁机器人在进行清洁作业时通常是使机器人在以预定移动模式移动的同时重复执行清洁。然而,针对清洁机器人移动到例如床下、沙发下等碎屑集中区域的情况,由于所述区域积尘过多,用户有时会不想对该区域进行清洁,或者用户希望在清洁完其他区域后再对该区域进行清洁。
发明内容
鉴于以上所述现有技术的缺点,本申请的目的在于提供一种控制方法、***及所适用的清洁机器人,用于解决现有技术中无法根据用户需求划区域进行清洁的问题。
为实现上述目的及其他相关目的,本申请的第一方面提供一种清洁机器人的控制方法,其中,所述清洁机器人包含摄像装置、移动***和清洁***,所述控制方法包括以下步骤:在清洁机器人导航操作环境下,控制所述摄像装置实时拍摄图像;对所拍摄的至少一幅图像进行分析;基于所述分析的结果并按照预设的清洁模式控制所述清洁机器人的行为;其中,所述清洁模式包括对应欠曝区域的清洁模式。
本申请的第二方面还提供一种清洁机器人的控制***,其中所述清洁机器人包含摄像装置、移动***和清洁***,包括:存储装置,存储有一个或多个程序;处理装置,与所述存储装置相连,通过调取所述一个或多个程序以执行以下步骤:在清洁机器人导航操作环境下,控制所述摄像装置实时拍摄图像;对所拍摄的至少一幅图像进行分析;基于所述分析的结果并按照预设的清洁模式控制所述清洁机器人的行为;其中,所述清洁模式包括对应欠曝区域的清洁模式。
本申请的第三方面还提供一种清洁机器人,包括:如前述的任一所述的控制***,用于基于对所拍摄的图像的分析结果输出对应于预设清洁模式的控制指令;摄像装置,与所述控制***相连,用于摄取图像以供所述控制***处理;移动***,与所述控制***相连,用于基于所述控制指令驱动清洁机器人移动;清洁***,与所述控制***相连,用于基于所述控制指令进行清洁操作。
本申请的第四方面还提供一种计算机设备的存储介质,所述存储介质存储有至少一个程序,所述程序被处理器执行时执行前述中任一所述方法。
如上所述,本申请的控制方法、***及所适用的清洁机器人,具有以下有益效果:通过采用对所拍摄的图像进行分析,基于分析的结果并按照预设的清洁模式来控制清洁机器人的行为的技术方案,使得对于某些特定区域,用户可以根据需求进行清洁,实现根据用户需求划区域进行清洁的目的。
附图说明
图1显示为本申请清洁机器人的控制***在一种实施方式中的结构示意图。
图2显示为本申请清洁机器人针对欠曝区域的情况在一种实施方式中的示意图。
图3显示为本申请清洁机器人针对欠曝区域的情况在另一种实施方式中的示意图。
图4显示为本申请清洁机器人的控制方法在一种实施方式中的流程图。
图5显示为本申请的清洁机器人在一种实施方式中的结构示意图。
具体实施方式
以下由特定的具体实施例说明本申请的实施方式,熟悉此技术的人士可由本说明书所揭露的内容轻易地了解本申请的其他优点及功效。
在下述描述中,参考附图,附图描述了本申请的若干实施例。应当理解,还可使用其他实施例,并且可以在不背离本申请的精神和范围的情况下进行机械组成、结构、电气以及操作上的改变。下面的详细描述不应该被认为是限制性的,并且本申请的实施例的范围仅由公布的专利的权利要求书所限定。这里使用的术语仅是为了描述特定实施例,而并非旨在限制本申请。
再者,如同在本文中所使用的,单数形式“一”、“一个”和“该”旨在也包括复数形式,除非上下文中有相反的指示。应当进一步理解,术语“包含”、“包括”表明存在所述的特征、步骤、操作、元件、组件、项目、种类、和/或组,但不排除一个或多个其他特征、步骤、操作、元件、组件、项目、种类、和/或组的存在、出现或添加。此处使用的术语“或”和“和/或”被解释为包括性的,或意味着任一个或任何组合。因此,“A、B或C”或者“A、B和/或C”意味着“以下任一个:A;B;C;A和B;A和C;B和C;A、B和C”。仅当元件、功能、步骤或操作的组合在某些方式下内在地互相排斥时,才会出现该定义的例外。
清洁机器人是在不需要用户控制的情况下在待清洁区域行进的同时通过从待清洁区域的地面吸入碎屑(例如,灰尘)来自动清洁待清洁区域的装置。清洁机器人基于视觉传感器所提供的视觉信息并结合其他移动传感器所提供的移动数据,一方面能够构建机器人所在场地的地图数据,另一方面,还可基于已构建的地图数据提供路线规划、路线规划调整及导航服务,这使得清洁机器人的移动效率更高。其中,所述视觉传感器举例包括摄像装置,对应的视觉信息为 图像数据(以下简称为图像)。所述移动传感器举例包括速度传感器、里程计传感器、距离传感器、悬崖传感器等。然而,针对清洁机器人移动到例如床下、沙发下等碎屑集中区域的情况,由于所述区域积尘过多使得用户可能不想要对此区域进行清洁,或者用户希望在清洁完其他区域后再对该区域进行清洁。
为了解决上述问题,本申请提供一种清洁机器人的控制***。其中,所述清洁机器人包含摄像装置、移动***和清洁***。
所述摄像装置与所述清洁机器人的控制***相连,所述摄像装置用来摄取图像以供所述控制***处理。所述摄像装置包括但不限于:照相机、视频摄像机、集成有光学***或CCD芯片的摄像模块、集成有光学***和CMOS芯片的摄像模块等。所述摄像装置的供电***可受清洁机器人的供电***控制,当机器人上电移动期间,所述摄像装置即开始拍摄图像。此外,所述摄像装置可以设于清洁机器人的主体上。例如,所述摄像装置可以设于清洁机器人的顶盖的中部或边缘,或者所述摄像装置可以设于清洁机器人的顶部表面的平面之下、在主体的几何中心附近或主体的边缘附近的凹入结构上。在某些实施例中,所述摄像装置可以位于清洁机器人的顶面,并且所述摄像装置的视野光学轴相对于垂线为±30°。例如,所述摄像装置位于清洁机器人顶面的中间位置或边缘,且其光学轴相对于垂线的夹角为-30°、-29°、-28°、-27°……-1°、0°、1°、2°……29°、或30°。需要说明的是,本领域技术人员应该理解,上述光学轴与垂线的夹角仅为举例,而非限制其夹角精度为1°的范围内,根据实际机器人的设计需求,所述夹角的精度可更高,如达到0.1°、0.01°以上等,在此不做无穷尽的举例。
所述移动***与所述清洁机器人的控制***相连,所述移动***基于所述控制***输出的控制指令驱动清洁机器人移动。在一实施例中,所述移动***包括驱动控制装置和至少两个滚轮组。其中,所述至少两个滚轮组中的至少一个滚轮组为受控滚轮组。所述驱动控制装置与所述控制***相连,所述驱动控制装置基于所述控制***输出的控制指令驱动所述受控滚轮组滚动。
所述驱动控制装置包含驱动电机。所述驱动电机与所述滚轮组相连用于直接驱动滚轮组滚动。所述驱动控制装置还可以包含专用于控制驱动电机的一个或多个处理器(如CPU或微处理单元(MCU))。例如,所述微处理单元用于将所述控制***输出的控制指令转化为对驱动电机进行控制的电信号,并根据所述电信号控制所述驱动电机的转速、转向等以驱动清洁机器人移动。所述驱动控制装置中的处理器可以和所述控制***中的处理器共用或可独立设置。例如,所述驱动控制装置中的处理器作为从处理设备,所述控制***中的处理器作为主设备,驱动控制装置基于控制***的控制进行移动控制。或者所述驱动控制装置中的处理器与所述控制***中的处理器相共用。驱动控制装置通过程序接口接收控制***所输出的控制指令。所述驱动控 制装置基于所述控制***输出的控制指令驱动所述受控滚轮组滚动。
所述清洁***与所述清洁机器人的控制***相连,所述清洁***基于所述控制***输出的控制指令进行清洁操作。在一实施例中,所述清洁***包括清洁组件和清洁驱动控制组件。其中,所述清洁驱动控制组件与所述控制***相连,所述清洁驱动控制组件基于所述控制***输出的控制指令驱动所述清洁组件清洁地面。
所述清洁组件可包括辊刷组件、过滤网、擦洗组件、吸入管道、集尘盒(或垃圾盒)、吸风电机等。所述辊刷组件和擦洗组件可根据清洁机器人的实际设计而择一配置或全部配置。所述辊刷组件包括但不限于:边刷、边刷驱动器、辊轮、辊轮驱动器等。所述擦洗组件包括但不限于:盛水容器、擦拭布、布的装配结构及所述装配结构的驱动器等。
所述清洁驱动控制组件可以包含专用于控制清洁组件的一个或多个处理器(如CPU或微处理单元(MCU))。所述清洁驱动控制组件中的处理器可以和所述控制***中的处理器共用或可独立设置。例如,所述清洁驱动控制组件中的处理器作为从处理设备,所述控制***中的处理器作为主设备,清洁驱动控制组件基于控制***输出的控制指令进行清洁操作。或者所述清洁驱动控制组件中的处理器与所述控制***中的处理器相共用。
请参阅图1,图1显示为本申请清洁机器人的控制***在一种实施方式中的结构示意图。如图所示,本申请清洁机器人的控制***包括存储装置11以及处理装置13。
所述存储装置11存储有一个或多个程序。所述程序包括稍后描述的由处理装置13调取以执行控制、分析等步骤的相应程序。所述存储装置包括但不限于高速随机存取存储器、非易失性存储器。例如一个或多个磁盘存储设备、闪存设备或其他非易失性固态存储设备。在某些实施例中,存储装置还可以包括远离一个或多个处理器的存储器,例如,经由RF电路或外部端口以及通信网络(未示出)访问的网络附加存储器,其中所述通信网络可以是因特网、一个或多个内部网、局域网(LAN)、广域网(WLAN)、存储局域网(SAN)等,或其适当组合。存储器控制器可控制机器人的诸如CPU和外设接口之类的其他组件对存储装置的访问。
所述处理装置13与存储装置11相连并能够与上述的摄像装置、移动***以及清洁***进行数据通信。处理装置13可包括一个或多个处理器。处理装置13可操作地与存储装置11中的易失性存储器和/或非易失性存储器耦接。处理装置可执行在存储器和/或非易失性存储设备中存储的指令以在机器人中执行操作,诸如对拍摄的图像进行分析并基于分析结果控制清洁机器人的行为等。如此,处理器可包括一个或多个通用微处理器、一个或多个专用处理器(ASIC)、一个或多个数字信号处理器(DSP)、一个或多个现场可编程逻辑阵列(FPGA)、或它们的任何组合。所述处理装置还与I/O端口和输入结构可操作地耦接,该I/O端口可使得机器人能够与各种其他电子设备进行交互,该输入结构可使得用户能够与计算设备进行交互。因此,输入结 构可包括按钮、键盘、鼠标、触控板等。所述其他电子设备可以是所述机器人中移动装置中的移动电机,或机器人中专用于控制移动装置的从处理器,如MCU(Microcontroller Unit,微控制单元,简称MCU)。
在一种示例中,所述处理装置通过数据线分别连接存储装置以及上述的摄像装置、移动***和清洁***。所述处理装置通过数据读写技术与存储装置进行交互,所述处理装置通过接口协议与摄像装置、移动***以及清洁***进行交互。其中,所述数据读写技术包括但不限于:高速/低速数据接口协议、数据库读写操作等。所述接口协议包括但不限于:HDMI接口协议、串行接口协议等。
所述处理装置13通过调取存储装置11中所存储的程序以执行以下步骤:在清洁机器人导航操作环境下,控制所述摄像装置实时拍摄图像;对所拍摄的至少一幅图像进行分析;基于所述分析的结果并按照预设的清洁模式控制所述清洁机器人的行为;其中,所述清洁模式包括对应欠曝区域的清洁模式。其中,所述导航操作环境是指机器人依据当前定位及基于当前定位而确定的导航路线进行移动和执行相应操作的环境。具体地,清洁机器人的导航操作环境是指清洁机器人依据导航路线移动并进行清洁操作的环境。
首先,处理装置13在清洁机器人导航操作环境下控制摄像装置实时拍摄图像。例如,摄像装置可以是用来拍摄静态图像或视频的摄像头。在一实施例中,清洁机器人可根据导航操作环境预先设定拍摄图像的时间间隔,然后处理装置控制摄像装置以预设的时间间隔来拍摄图像以获取不同时刻下的静态图像。在另一实施例中,处理装置控制摄像装置拍摄视频。
然后,处理装置13对所拍摄的至少一幅图像进行分析。其中,在摄像装置获取的是静态图像的情况下,处理装置可以对所获取的静态图像中的至少一幅图像进行分析。在摄像装置获取的是视频的情况下,由于视频是由图像帧构成的,因此处理装置首先可以连续或不连续地采集所获取的视频中的图像帧,然后选用一帧图像作为一幅图像进行分析。其中,处理装置可以对一幅或更多幅图像进行分析。
接着,处理装置13基于分析的结果并按照预设的清洁模式控制清洁机器人的行为;其中,所述清洁模式包括对应欠曝区域的清洁模式。其中,所述分析的结果包括清洁机器人位于欠曝区域,或者机器人未位于欠曝区域。所述欠曝区域是物体遮住光源(太阳或聚光灯等)所发射的光线使得光线不能穿过不透明物体而形成的较暗区域如床下、沙发下等。另外,所述欠曝区域还可以指由于清洁机器人在位于光强过弱的一个区域时,导致进入其摄像装置的进光量过少而造成拍摄图像中包含亮度低于预设亮度阈值的情况,即导致图像欠曝。其中,所述拍摄图像中包含的亮度可由图像灰度值来描述,例如,所述处理装置检测到图像中包含灰度值小于预设灰度阈值的区域时将所述图像确定为欠曝图像,以此确定所述清洁机器人位于欠曝区域。在某 些实施例中,还可以将所述亮度由摄像装置中光照感应器所提供的光强数值来描述,例如,所述处理装置获取图像以及对应的光强数据,当光强数据小于预设光强阈值时,所述处理装置确定清洁机器人位于欠曝区域。再或者,在某些实施例中,所述处理装置根据图像中的灰度值和光强数据来确定清洁机器人是否位于欠曝区域。例如,所述处理装置以同时满足上述两示例中的两种条件确定清洁机器人位于欠曝区域。
所述预设的清洁模式可以是预先设置的,并且由清洁机器人中的存储装置存储。所述清洁模式还可以通过预先对机器人的清洁区域进行限定而得到。在一实施例中,用户可以根据不同分类方式设置清洁区域。例如,根据欠曝情况,处理装置可以设置清洁区域包括欠曝区域、未欠曝区域。又如,用户根据房间规划,可以为清洁机器人设置不同分类的清洁区域,包括房间区域、客厅区域、厨房区域等,不同分类之间的区域可以存在交叠,例如,客厅区域中床下、沙发下等可以属于欠曝区域,而其他区域可以属于非欠曝区域。借助已设定的各清洁区域,用户可以向清洁机器人输入对应各清洁区域的清洁模式。其中,所述清洁模式包括对应欠曝区域的清洁模式。例如,对应欠曝区域的清洁模式可以设置为在清洁机器人位于欠曝区域的情况下停止清洁,或者在清洁机器人位于欠曝区域的情况下滞后清洁,或者在清洁机器人位于欠曝区域的情况下着重清洁,或者在清洁机器人位于欠曝区域的情况下持续清洁等。所述清洁方式可以包括清扫、吸尘、拖擦或其任意组合。所述滞后清洁可以是例如在清洁机器人位于欠曝区域的情况下先不进行清洁转而继续清洁其他区域,待其他区域清洁完成后再返回清洁欠曝区域。所述着重清洁可以是通过调整清洁组件的清洁模式例如增大辊刷组件转速、通过液体施加器喷洒液体、增大擦洗组件压力以及增大真空气道的抽吸力等等至少一种手段来加强清洁效果。所述清洁模式还包括其他清洁模式,例如,在清洁机器人未位于欠曝区域的情况下持续清洁。
所述清洁机器人中的处理装置可基于用户输入而确定对应欠曝区域的清洁模式。在此,清洁机器人还可以包含人机交互装置,其亦与处理装置相连。用户可直接在清洁机器人所提供的人机交互装置上输入对应各欠曝区域的清洁模式。或者,清洁机器人包含与处理装置相连的网络装置,用户的其他智能终端(如手机、平板电脑、个人电脑等)可通过所述网络装置与处理装置进行数据传输,用户通过操作其他智能终端将所输入的对应各欠曝区域的清洁模式传递至处理装置,并由处理装置将相应的对应关系存储到清洁机器人的存储装置中。其中,所述欠曝区域可以是预先标定的;或经由清洁机器人通过图像分析得到的;或者而结合预先标定和图像分析所确定的。每个欠曝区域可对应统一的清洁模式,或各自单独对应清洁模式,再或者按照欠曝区域的分类对应清洁模式等。
所述清洁机器人的行为可以包括在清洁机器人位于欠曝区域时沿原导航路线移动且持续清洁、沿原导航路线移动且停止清洁、沿原导航路线移动且着重清洁、沿原导航路线移动且滞后 清洁、修改导航路线且持续清洁等。所述清洁机器人的行为还可以包括在清洁机器人未位于欠曝区域时沿原导航路线持续清洁等。
需要说明的是,上述清洁模式以及清洁机器人的行为仅为举例,而非对本申请清洁模式和清洁机器人的行为的方式的限制。事实上,技术人员可以根据清洁机器人类型、用户需求等设置其他方式的清洁模式以及清洁机器人的行为,在此不再一一描述。
本申请清洁机器人的控制***,通过采用处理装置对摄像装置所拍摄的图像进行分析,基于分析的结果并按照预设的清洁模式来控制清洁机器人的行为的技术方案,使得对于某些特定区域,用户可以根据需求进行清洁,实现根据用户需求划区域进行清洁的目的。
为了能够准确定位清洁机器人的当前位置,所述处理装置执行对所拍摄的至少一幅图像进行分析的步骤包括:利用在至少一幅图像所识别的定位特征确定所述清洁机器人的当前位置。其中,所述定位特征包括但不限于:形状特征、灰度特征等。所述形状特征包括但不限于:角点特征、直线特征、边缘特征、曲线特征等。所述灰度特征包括但不限于:灰度跳变特征、高于或低于灰度阈值的灰度值、图像帧中包含预设灰度范围的区域尺寸等。此外,为了清洁机器人能够基于图像获取到足够多的定位特征,所述图像中所能识别的定位特征数量通常为多个,例如10个以上。
在一实施例中,针对处理装置利用在至少一幅图像所识别的定位特征确定清洁机器人的当前位置的实现方式,例如,通过对所拍摄图像中实物的图形进行识别并与标准件的图形进行匹配、以及基于所述标准件的标准物理特征来确定机器人在当前物理空间中的定位信息。又如,通过对图像中所识别的特征与预设的地图数据中的地标信息中的特征的匹配来确定机器人在当前物理空间中的定位信息。在另一实施例中,针对处理装置利用在至少两幅图像所识别的定位特征确定清洁机器人的当前位置的实现方式,例如,利用两图像帧中所匹配的特征点的位置偏移信息来确定机器人的位置及姿态。在再一实施例中,处理装置利用在至少一幅图像所识别的定位特征及预先建立的地标信息对所述清洁机器人进行定位。其中,所述地标信息可以是在历次导航期间收集的并对应于地图数据中定位点的一种属性信息,其包括但不限于:在地图数据的某个定位点上摄像装置所能摄取的定位特征、历次拍摄到所述定位特征时在物理空间的地图数据、历次拍摄到所述定位特征时在相应图像帧中的位置、拍摄相应定位特征时清洁机器人的位置及姿态等属性信息。所述地标信息可与地图数据一并保存在所述存储装置。所述地图数据可以是基于SLAM(Simultaneous Localization and Mapping,即时定位与地图构建)或VSLAM技术预先构建而得到的。
另外,所述处理装置在利用所拍摄的至少一幅图像进行定位分析时,不必然有时序限制地,还执行利用至少一幅图像中的灰度特征分析所述清洁机器人是否位于欠曝区域的步骤。在 实际应用中,当清洁机器人位于欠曝区域时,通常清洁机器人位于床下、沙发下等位置。针对清洁机器人位于床下、沙发下的情况,由于床下、沙发下通常积尘过多,所以用户可能不想对该区域进行清洁或者用户希望在清洁完其他区域后再对该区域进行清洁。
为此,处理装置对机器人是否位于欠曝区域进行分析。在此,处理装置通过获取摄像装置所拍摄的图像进行欠曝区域分析。其中,通常摄像装置所拍摄的图像为RGB颜色模式,因此处理装置需先对所拍摄的图像进行灰度化处理得到灰度图像,然后再对灰度图像进行欠曝分析以确定清洁机器人是否位于欠曝区域。在此,处理装置可采用分量法、最大值法、平均值法或者加权平均法等对所拍摄的图像进行灰度化处理获得灰度图像。灰度图像是一种具有从黑到白256级灰度色阶或等级的单色图像,255表示白色,0表示黑色。
在一实施例中,处理装置通过分析灰度图像中的灰度分布、灰度均值、灰度极大极小值等来确定清洁机器人是否位于欠曝区域。在一示例中,在选用灰度分布来表征灰度特征的情况下,当处理装置对灰度图像中灰度分布进行分析后得到灰度分布集中在预设的灰度欠曝区间时,处理装置确定清洁机器人位于欠曝区域。例如,灰度欠曝区间可以根据技术经验或者实验设计得知并预先存储在清洁机器人中。
在一具体示例中,摄像装置被选用为包括可调光圈的摄像装置。处理装置可以通过增大进光量来改变摄像装置所获取的光线的强度。在对进光量进行补偿之后,处理装置对调整后的摄像装置所拍摄的图像进行分析。其中,光圈通常设置在摄像装置内,用于调节进入摄像装置中光线的多少。例如,当摄像装置位于欠曝区域时可自动通过增大所述光圈来增加进光量,然后,处理装置对调整后摄像装置所摄取的至少一幅图像进行上述欠曝分析,以提高分析结果的准确性。上述处理装置分析所拍摄的图像的方式虽然能够及时响应对应欠曝区域的清洁模式,但是在一些实际应用中,由于这种摄像装置在不同光强下会自动调整光圈,这在光强变化时可能会因为光圈调整不及时而导致所拍摄的图像灰度普遍较低的情况。为了防止处理装置对单一图像的分析不准确的问题,可通过对多幅图像进行欠曝分析,以提高欠曝分析的准确性。
此外,在另一些实际应用中,还存在摄像装置在短时间段内被物体遮挡的情况。在摄像装置被遮挡的情况下,为防止遮挡对处理装置分析的结果的干扰,处理装置可以对预设时长内所拍摄的多幅图像进行分析以确定所述清洁机器人是否位于欠曝区域。其中,预设时长可以根据技术经验或者实验设计得知并预先存储在清洁机器人中。一般地,预设时长可以在毫秒级。例如,所述处理装置在检测到一幅图像欠曝时,继续检测后续摄像装置所摄取的至少一幅图像也呈现欠曝特性,进而确定机器人位于欠曝区域。
在另一种实施方式中,处理装置还可以通过对来自感光元件的感光信息进行分析以确定所述清洁机器人是否位于欠曝区域。其中,所述感光元件包括但不限于:光敏电阻、光敏三极管、 光电倍增管、CCD元件、CMOS器件等。此外,所述感光元件可以设于清洁机器人的主体上。例如,所述感光元件可以设于清洁机器人的顶盖的沿行进方向的前边缘处,或者可以在清洁机器人的顶盖的边缘处间隔设置多个感光元件,以使得当清洁机器人的一部分进入欠曝区域时即可确定清洁机器人位于欠曝区域。在某些实施例中,首先,感光元件将所感应的光强转化为光强数据输出给处理装置,然后,处理装置对所述光强数据进行分析以确定清洁机器人是否位于欠曝区域。例如,处理装置预先设置欠曝阈值,当所得到的光强数据低于所述欠曝阈值时,处理装置确定清洁机器人位于欠曝区域;反之,则确定清洁机器人未位于欠曝区域。其中,所述欠曝阈值可以根据技术经验或者实验设计得知并预先存储在清洁机器人中。
需要说明的是,基于上述一种或多种分析方式而确定清洁机器人是否位于欠曝区域仅为举例,而非对本申请确定位于欠曝区域的方式的限制。事实上,技术人员还可以利用多种灰度分析方法并结合感光元件方法所得到结果进一步对清洁机器人是否位于欠曝区域进行评价等方式来最终确定清洁机器人是否位于欠曝区域。在此不再一一描述。然而,基于本申请所提及的任一种图像灰度值而进行分析,或在此基础上改进以确定的清洁机器人位于欠曝区域的方式应视为本申请的一种具体示例。
所述处理装置接着执行基于所述分析的结果并按照预设的清洁模式控制所述清洁机器人的行为的步骤。其中,所述清洁模式包括对应欠曝区域的清洁模式。
在此,当处理装置基于预先规划的路线控制移动***移动至欠曝区域时,可采用对应的清洁模式进行移动和清洁操作。其中,所述处理装置基于所述分析的结果并按照预设的清洁模式控制所述清洁机器人的行为的方式包括以下任一种:
1) Adjusting the navigation route of the cleaning robot to leave the underexposed area. Referring to FIG. 2, a schematic diagram of the cleaning robot of the present application handling an underexposed area in one embodiment: in this example, the underexposed area is the area under a bed and the preset cleaning mode for underexposed areas is no cleaning. While working, the cleaning robot moves along the original navigation route in the direction of arrow A and performs cleaning. When the processing device determines, by any of the above methods, that the robot is in an underexposed area, i.e., under the bed, it modifies the navigation route and controls the moving system to leave the underexposed area. In this example, the cleaning robot turns 180° as shown by turning arrow B and continues moving, so that it leaves the space under the bed and continues in the direction of arrow C to clean other non-underexposed areas. It should be noted that, as those skilled in the art will appreciate, the 180° turn shown in FIG. 2 is merely an example; the turning angle is not limited to 180° and may be set flexibly according to actual needs, without exhaustive enumeration here.
2) Controlling the cleaning robot to pass through the underexposed area along the original navigation route, and controlling the cleaning robot to clean the traversed underexposed area according to the preset cleaning mode for underexposed areas. Referring to FIG. 3, a schematic diagram of the cleaning robot of the present application handling an underexposed area in another embodiment: in one example, the underexposed area is the area under a bed and the preset cleaning mode for underexposed areas is no cleaning. While working, the cleaning robot moves along the original navigation route in the direction of arrow A and performs cleaning. When the processing device determines, by any of the above methods, that the robot is in an underexposed area, i.e., under the bed, it controls the moving system to continue along the original route in the direction of arrow B while controlling the cleaning system not to clean the underexposed area currently occupied; once the robot is detected to be in a non-underexposed area, it continues to control the moving system along the original route in the direction of arrow C and controls the cleaning system to clean according to the cleaning mode for non-underexposed areas. In another example, the preset cleaning mode for underexposed areas is intensive cleaning: when the processing device determines that the robot is under the bed, it controls the moving system to continue along the original route in the direction of arrow B while controlling the cleaning system to clean the occupied underexposed area intensively; once the robot is detected to be in a non-underexposed area, it continues along the original route in the direction of arrow C and controls the cleaning system to perform normal cleaning according to the cleaning mode for non-underexposed areas.
In spaces that need cleaning but receive no light, such as a room with the lights off, it would be inaccurate for the processing device to control the cleaning system and moving system in a no-cleaning mode based solely on grayscale features or on light-sensing information from the photosensitive element. To adapt to the full range of practical scenarios, the storage device stores map data and landmark information, where the landmark information contains not only the attribute information described above but also object information represented by the localization features, attributes marking the corresponding underexposed areas, and the like. The processing device further performs, in the navigation operating environment, the step of constructing landmark information of underexposed areas in the pre-built map data. When the processing device determines that the cleaning robot has moved from a non-underexposed area into an underexposed area, or from an underexposed area into a non-underexposed area, it adds attribute information to the pre-built map data. For example, the processing device determines the boundary position between the underexposed and non-underexposed areas from real-time localization; it then adds an underexposed-area attribute at that boundary position in the map data, or adds the underexposed-area attribute to landmark information already existing near the boundary.
In practical applications, indoor furnishings and lighting conditions may change. For example, a position previously recorded in landmark information as not underexposed because no sofa stood there may be analyzed as underexposed after a sofa is moved onto it. Likewise, a room is not underexposed while sunlight shines into it, yet becomes an underexposed area at night with the lights off. The processing device therefore further performs the step of updating the landmark information of the corresponding underexposed areas in the pre-built map data. For example, when underexposure analysis via images or the light-sensing device shows that the cleaning robot is in an underexposed area while the corresponding landmark information carries a non-underexposed-area attribute, the processing device updates that landmark information; conversely, when the analysis shows that the robot is in a non-underexposed area while the landmark information carries an underexposed-area attribute, it likewise updates the landmark information.
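One way to hold and reconcile such attributes is sketched below; the `Landmark` structure, the search radius, and the update rule are assumptions made for illustration, not the data layout of the present application:

```python
from dataclasses import dataclass, field

@dataclass
class Landmark:
    position: tuple             # localization point in the map data
    underexposed: bool = False  # attribute added or updated at runtime
    features: list = field(default_factory=list)

def update_landmarks(landmarks, robot_pos, observed_underexposed, radius=0.5):
    """Reconcile the stored underexposure attribute of landmarks near the
    robot with the current observation, as the text describes; the search
    radius (in meters) is an illustrative assumption."""
    for lm in landmarks:
        dx = lm.position[0] - robot_pos[0]
        dy = lm.position[1] - robot_pos[1]
        if dx * dx + dy * dy <= radius * radius and lm.underexposed != observed_underexposed:
            lm.underexposed = observed_underexposed
```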
On the basis of landmark information containing these attributes, the processing device navigates the cleaning robot based on the preset cleaning mode, the localization information identified in at least one image, and the landmark information, where the landmark information carries either an underexposed-area attribute or a non-underexposed-area attribute. In one specific example, given a preset cleaning mode of not cleaning underexposed areas while cleaning non-underexposed areas, and given the landmarks in the map data carrying underexposed-area attributes, the processing device constructs a cleaning route that excludes the underexposed areas; it determines the current position of the cleaning robot from at least one image captured along the route and controls the moving system to move and clean along the route, so the underexposed areas are not cleaned. In another specific example, given a preset cleaning mode of deferred intensive cleaning of underexposed areas while cleaning non-underexposed areas, the processing device constructs a first route segment that cleans the non-underexposed areas, a second segment that moves from the end of that route to the underexposed areas, and a third segment that cleans the underexposed areas. It determines the robot's current position from at least one image captured along the first segment and controls the moving system to move and clean along it; next, the processing device localizes from the captured images and controls the moving system to travel the second segment from the end of the first to the start of the third, keeping the cleaning system idle during that transit; finally, it controls the cleaning system to clean according to the preset intensive cleaning mode while moving along the third segment. During movement along the third segment, the processing device may localize and navigate using only the movement data provided by the motion sensors of the cleaning robot.
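The three-segment route for the deferred intensive-cleaning mode could be assembled along the lines of the following sketch, where `cells` is a list of (position, underexposed) pairs and the ordering heuristic is an assumption:

```python
def plan_deferred_route(cells):
    """Split map cells into the three segments described above: clean
    non-underexposed cells first, transit with cleaning off, then clean
    the underexposed cells intensively."""
    normal = [p for p, dark in cells if not dark]
    dark = [p for p, dark in cells if dark]
    segment1 = [(p, "clean") for p in normal]          # regular cleaning
    segment2 = [(dark[0], "transit")] if dark else []  # move, cleaning idle
    segment3 = [(p, "intensive") for p in dark]        # deferred cleaning
    return segment1 + segment2 + segment3
```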
When the analysis determines that the cleaning robot is at the entrance of a room and that the corresponding room is an underexposed area, the processing device controls the behavior of the cleaning robot according to the cleaning mode for that room. For example, when the processing device, guided by landmark information, has moved the robot to the entrance of an unlit room, the image analysis will report an underexposed area. The processing device can take the localization features identified in cached images captured before entering the room, and obtain a door attribute from the landmark information containing those features, thereby determining that the underexposed area is a room that needs cleaning. It therefore calculates the distance moved and the current position from the movement data provided by the robot's motion sensors and controls the moving system of the cleaning robot to move within the room along the original navigation route, while also controlling the cleaning system to clean the room's floor and wall edges.
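Within the unlit room, localization falls back to motion-sensor data alone; a textbook dead-reckoning update of this kind is sketched below, assuming the sensor supplies (distance, heading change) increments:

```python
import math

def dead_reckon(pose, wheel_odometry):
    """Update the (x, y, heading) pose purely from motion-sensor data while
    images are underexposed, as the text describes; the increment format
    is an assumption for illustration."""
    x, y, theta = pose
    for distance, dtheta in wheel_odometry:
        theta += dtheta
        x += distance * math.cos(theta)
        y += distance * math.sin(theta)
    return (x, y, theta)
```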
Furthermore, in some embodiments, the processing device may also output prompt information after entering an underexposed area. For example, after entering an underexposed area, the processing device may remind the user that the cleaning robot has entered one by sounding an audible alarm or by sending a message to the user's mobile terminal.
The present application further provides a control method of a cleaning robot, where the cleaning robot comprises a camera device, a moving system, and a cleaning system.
The camera device is connected to the control system of the cleaning robot and captures images for the control system to process. The camera device includes, but is not limited to, a camera, a video camera, a camera module integrating an optical system or a CCD chip, or a camera module integrating an optical system and a CMOS chip. The power supply of the camera device may be controlled by that of the cleaning robot, so that the camera device starts capturing images while the robot is powered on and moving. In addition, the camera device may be arranged on the main body of the cleaning robot, for example at the middle or edge of its top cover, or below the plane of its top surface in a recessed structure near the geometric center of the main body or near the edge of the main body. In some embodiments, the camera device may be located on the top surface of the cleaning robot with the optical axis of its field of view within ±30° of the vertical; for example, the camera device sits at the middle or edge of the top surface with its optical axis at an angle of -30°, -29°, -28°, -27°, ..., -1°, 0°, 1°, 2°, ..., 29°, or 30° to the vertical. It should be noted that, as those skilled in the art will appreciate, these angles are only examples and do not limit the angular precision to 1°; depending on the actual robot design, the precision may be higher, such as 0.1°, 0.01°, or finer, without exhaustive enumeration here.
The moving system is connected to the control system of the cleaning robot and drives the cleaning robot to move based on control commands output by the control system. In one embodiment, the moving system comprises a drive control device and at least two roller sets, of which at least one roller set is a controlled roller set. The drive control device is connected to the control system and drives the controlled roller set to roll based on the control commands output by the control system.
The drive control device comprises a drive motor connected to the roller sets for directly driving them to roll. The drive control device may also comprise one or more processors (such as a CPU or a microcontroller unit (MCU)) dedicated to controlling the drive motor. For example, the microcontroller unit converts the control commands output by the control system into electrical signals that control the drive motor, and regulates the rotation speed, steering and the like of the drive motor according to those signals so as to drive the cleaning robot. The processor in the drive control device may be shared with the processor in the control system or provided independently; for example, the processor in the drive control device acts as a slave processing device and the processor in the control system as the master device, with the drive control device performing movement control under the control system's direction; alternatively, the two processors are shared. The drive control device receives the control commands output by the control system through a program interface and drives the controlled roller set to roll accordingly.
The cleaning system is connected to the control system of the cleaning robot and performs cleaning operations based on the control commands output by the control system. In one embodiment, the cleaning system comprises a cleaning assembly and a cleaning drive control assembly, where the cleaning drive control assembly is connected to the control system and drives the cleaning assembly to clean the floor based on the control commands output by the control system.
The cleaning assembly may comprise a roller brush assembly, a filter screen, a scrubbing assembly, a suction duct, a dust box (or waste box), a suction fan motor, and the like. The roller brush assembly and the scrubbing assembly may be provided alternatively or both, depending on the actual design of the cleaning robot. The roller brush assembly includes, but is not limited to, side brushes, side brush drivers, rollers, and roller drivers. The scrubbing assembly includes, but is not limited to, a water container, a wiping cloth, a mounting structure for the cloth, and a driver of that mounting structure.
The cleaning drive control assembly may comprise one or more processors (such as a CPU or a microcontroller unit (MCU)) dedicated to controlling the cleaning assembly. The processor in the cleaning drive control assembly may be shared with the processor in the control system or provided independently; for example, the processor in the cleaning drive control assembly acts as a slave processing device and the processor in the control system as the master device, with the cleaning drive control assembly performing cleaning operations based on the control commands output by the control system; alternatively, the two processors are shared.
Referring to FIG. 4, a flowchart of the control method of the cleaning robot of the present application in one embodiment: the control method is mainly performed by a control system. The control system may be arranged in the cleaning robot and may be the one shown in FIG. 1 and described with it, or any other control system capable of performing the method. The control method comprises step S110, step S120, and step S130.
Here, steps S110, S120 and S130 may be carried out by a processing device. The processing device may comprise one or more processors and may be operably coupled to volatile and/or non-volatile memory in the storage device. The processing device may execute instructions stored in the memory and/or non-volatile storage to perform operations in the robot, such as analyzing captured images and controlling the behavior of the cleaning robot based on the analysis results. Accordingly, the processors may include one or more general-purpose microprocessors, one or more application-specific processors (ASICs), one or more digital signal processors (DSPs), one or more field-programmable gate arrays (FPGAs), or any combination thereof. The processing device is also operably coupled to I/O ports that enable the robot to interact with various other electronic devices, and to input structures that enable a user to interact with the computing device; the input structures may therefore include buttons, keyboards, mice, touchpads, and the like. The other electronic devices may be the movement motors in the robot's moving device, or a slave processor in the robot dedicated to controlling the moving device, such as an MCU (Microcontroller Unit).
In one example, the processing device is connected via data lines to the storage device and to the camera device, moving system, and cleaning system described above. The processing device interacts with the storage device through data read/write techniques, and with the camera device, moving system, and cleaning system through interface protocols. The data read/write techniques include, but are not limited to, high-speed/low-speed data interface protocols and database read/write operations. The interface protocols include, but are not limited to, the HDMI interface protocol and serial interface protocols.
In step S110, the camera device is controlled to capture images in real time in the navigation operating environment of the cleaning robot.
Here, the processing device controls the camera device to capture images in real time in the navigation operating environment of the cleaning robot. For example, the camera device may be a camera for capturing static images or video. In one embodiment, the cleaning robot may preset a capture interval according to the navigation operating environment, and the processing device then controls the camera device to capture images at the preset interval so as to obtain static images at different moments. In another embodiment, the processing device controls the camera device to capture video.
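A capture loop matching this description might look like the following sketch; `camera.read()` returning an (ok, frame) pair follows the OpenCV VideoCapture convention and, like the interval and duration values, is an assumption:

```python
import time

def capture_frames(camera, interval_s=0.2, duration_s=2.0):
    """Poll the camera at a preset interval to collect static frames,
    as step S110 describes."""
    frames, deadline = [], time.monotonic() + duration_s
    while time.monotonic() < deadline:
        ok, frame = camera.read()
        if ok:
            frames.append(frame)
        time.sleep(interval_s)
    return frames
```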
The navigation operating environment refers to the environment in which the robot moves and performs corresponding operations according to its current localization and a navigation route determined from that localization. Specifically, the navigation operating environment of the cleaning robot is the environment in which the cleaning robot moves along a navigation route and performs cleaning operations.
In step S120, at least one captured image is analyzed.
Here, the processing device analyzes at least one captured image. Where the camera device provides static images, the processing device may analyze at least one of the acquired static images. Where the camera device provides video, since video is composed of image frames, the processing device may first sample frames from the acquired video, continuously or discontinuously, and then select a frame to analyze as an image. The processing device may analyze one or more images.
In step S130, the behavior of the cleaning robot is controlled based on the analysis result and according to a preset cleaning mode, where the cleaning modes include a cleaning mode corresponding to underexposed areas.
Here, the processing device controls the behavior of the cleaning robot based on the analysis result and according to the preset cleaning mode. The analysis result indicates either that the cleaning robot is located in an underexposed area or that it is not. An underexposed area is a darker region formed where an object blocks the light emitted by a light source (the sun, a spotlight, etc.) so that the light cannot pass through the opaque object, such as the space under a bed or a sofa. An underexposed area may also refer to a region where the light intensity is so weak that too little light enters the camera device, so that the captured image contains areas whose brightness falls below a preset brightness threshold, i.e., the image is underexposed. The brightness in a captured image may be described by image grayscale values: for example, the processing device determines an image to be underexposed, and hence the cleaning robot to be in an underexposed area, when it detects that the image contains a region whose grayscale values are below a preset grayscale threshold. In some embodiments, the brightness may instead be described by light intensity values provided by a light sensor in the camera device: for example, the processing device obtains an image together with its corresponding light intensity data, and determines that the cleaning robot is in an underexposed area when the light intensity data is below a preset light intensity threshold. Alternatively, in some embodiments, the processing device determines whether the cleaning robot is in an underexposed area from both the grayscale values in the image and the light intensity data, for example by requiring both of the above conditions to hold simultaneously.
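The following sketch combines the two brightness cues exactly as this paragraph describes, requiring either cue or both together; every threshold value is an illustrative placeholder. `gray` is a grayscale image array and `intensity` a sensor reading:

```python
def underexposed(gray, intensity, gray_thresh=60, dark_ratio=0.9,
                 intensity_thresh=15.0, require_both=True):
    """Combine the grayscale test on the image with the threshold test on
    the sensor's light intensity value to decide underexposure."""
    by_image = (gray <= gray_thresh).mean() >= dark_ratio
    by_sensor = intensity < intensity_thresh
    return (by_image and by_sensor) if require_both else (by_image or by_sensor)
```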
The preset cleaning modes may be set in advance and stored in the storage device of the cleaning robot. A cleaning mode may also be obtained by delimiting the robot's cleaning areas in advance. In one embodiment, the user may set cleaning areas under different classifications. For example, by underexposure status, the processing device may set the cleaning areas to include underexposed areas and non-underexposed areas. As another example, the user may set differently classified cleaning areas for the cleaning robot by room layout, including bedroom areas, living room areas, kitchen areas and the like; areas under different classifications may overlap, for example the spaces under the bed and sofa within the living room area may belong to the underexposed areas while the other spaces belong to the non-underexposed areas. With the cleaning areas defined, the user can input to the cleaning robot the cleaning mode corresponding to each area, where the cleaning modes include a mode corresponding to underexposed areas. For example, the mode for underexposed areas may be set to stop cleaning while the robot is in an underexposed area, to defer cleaning, to clean intensively, or to continue cleaning. The cleaning manner may include sweeping, vacuuming, mopping, or any combination thereof. Deferred cleaning may mean, for example, skipping an underexposed area at first, continuing to clean other areas, and returning to clean the underexposed area after the other areas are finished. Intensive cleaning may strengthen the cleaning effect by adjusting the cleaning assembly's mode through at least one measure such as raising the roller brush speed, spraying liquid through a liquid applicator, increasing the pressure of the scrubbing assembly, or increasing the suction of the vacuum air duct. The cleaning modes also include other modes, for example continuing to clean while the cleaning robot is not in an underexposed area.
The processing device in the cleaning robot may determine the cleaning mode corresponding to underexposed areas based on user input. Here, the cleaning robot may also comprise a human-machine interaction device connected to the processing device, on which the user can directly input the cleaning mode for each underexposed area. Alternatively, the cleaning robot comprises a network device connected to the processing device; another smart terminal of the user (such as a mobile phone, tablet, or personal computer) can exchange data with the processing device through the network device, so that the user transmits the cleaning modes for the underexposed areas by operating the terminal, and the processing device stores the corresponding mappings in the storage device of the cleaning robot. The underexposed areas may be calibrated in advance, obtained by the cleaning robot through image analysis, or determined by combining advance calibration with image analysis. All underexposed areas may share a uniform cleaning mode, each may correspond to its own mode, or modes may be assigned by category of underexposed area.
The behavior of the cleaning robot may include, when the cleaning robot is located in an underexposed area, moving along the original navigation route while continuing to clean, moving along the original navigation route while stopping cleaning, moving along the original navigation route while cleaning intensively, moving along the original navigation route while deferring cleaning, or modifying the navigation route while continuing to clean. The behavior of the cleaning robot may further include continuing to clean along the original navigation route when the cleaning robot is not located in an underexposed area.
It should be noted that the above cleaning modes and behaviors of the cleaning robot are merely examples rather than limitations on the cleaning modes and behaviors of the present application. In fact, a skilled person may configure other cleaning modes and behaviors according to the type of cleaning robot, user requirements and the like, which are not described one by one here.
In the control method of the cleaning robot of the present application, the captured images are analyzed and the behavior of the cleaning robot is controlled based on the analysis result and according to a preset cleaning mode, so that certain specific areas can be cleaned as the user requires, achieving the goal of area-by-area cleaning according to user demand.
In order to accurately locate the current position of the cleaning robot, the step of analyzing at least one captured image in step S120 includes: determining the current position of the cleaning robot by means of localization features identified in at least one image. The localization features include, but are not limited to, shape features and grayscale features. The shape features include, but are not limited to, corner features, straight-line features, edge features, and curve features. The grayscale features include, but are not limited to, grayscale transition features, grayscale values above or below a grayscale threshold, and the size of regions within a preset grayscale range in an image frame. In addition, so that the cleaning robot can obtain a sufficient number of localization features from an image, the number of localization features identifiable in an image is usually more than one, for example more than ten.
In one embodiment, regarding how the current position of the cleaning robot is determined using localization features identified in at least one image: for example, the localization information of the robot in the current physical space can be determined by recognizing the graphics of physical objects in the captured image, matching them against the graphics of standard parts, and using the standard physical features of the standard parts. As another example, it can be determined by matching features identified in the image against features in the landmark information of preset map data. In another embodiment, regarding how the current position is determined using localization features identified in at least two images: for example, the position and attitude of the robot can be determined from the positional offsets of matched feature points between two image frames. In yet another embodiment, the cleaning robot is located using localization features identified in at least one image together with pre-established landmark information. The landmark information may be attribute information collected during previous navigations and corresponding to localization points in the map data, including but not limited to: the localization features that the camera device can capture at a given localization point of the map data, the map data in physical space when the localization features were previously captured, the positions of the localization features in the corresponding image frames when previously captured, and the position and attitude of the cleaning robot when the corresponding features were captured. The landmark information may be stored in the storage device together with the map data. The map data may be pre-built based on SLAM (Simultaneous Localization and Mapping) or VSLAM technology.
In addition, in step S120, while localization analysis is performed on at least one captured image, whether the cleaning robot is located in an underexposed area is also analyzed using grayscale features of at least one image, with no necessary ordering between the two. In practical applications, a cleaning robot in an underexposed area is typically under a bed, under a sofa, or in a similar location. Since such spaces usually accumulate excessive dust, the user may not want them cleaned, or may want them cleaned only after other areas are finished.
To this end, whether the robot is located in an underexposed area is analyzed. Here, the processing device performs the underexposure-area analysis on images obtained from the camera device. Since the images captured by the camera device are usually in RGB color mode, the processing device first converts the captured image into a grayscale image, and then performs underexposure analysis on the grayscale image to determine whether the cleaning robot is located in an underexposed area. The processing device may obtain the grayscale image by the component method, the maximum method, the average method, the weighted average method, or the like. A grayscale image is a single-channel image with 256 gray levels from black to white, where 255 represents white and 0 represents black.
In one embodiment, the processing device determines whether the cleaning robot is located in an underexposed area by analyzing the grayscale distribution, the mean grayscale value, the grayscale maxima and minima, and the like of the grayscale image. In one example, where the grayscale distribution is used to characterize the grayscale features, the processing device determines that the cleaning robot is located in an underexposed area when its analysis shows that the grayscale distribution is concentrated within a preset underexposure grayscale interval. The underexposure interval may, for example, be derived from engineering experience or experimental design and stored in the cleaning robot in advance.
In a specific example, the camera device is selected to include an adjustable aperture. The processing device can change the intensity of light received by the camera device by increasing the light intake. After compensating the light intake, the processing device analyzes the images captured by the adjusted camera device. The aperture is usually arranged inside the camera device and regulates how much light enters it. For example, when the camera device is in an underexposed area, the aperture can be enlarged automatically to increase the light intake, after which the processing device performs the above underexposure analysis on at least one image captured by the adjusted camera device, improving the accuracy of the analysis result. Although this way of analyzing captured images allows the cleaning mode for underexposed areas to be applied promptly, in some practical applications such a camera device adjusts its aperture automatically under different light intensities; when the light intensity changes, a lagging aperture adjustment may yield images with generally low grayscale values. To avoid inaccurate analysis of a single image, underexposure analysis may be performed on multiple images to improve its accuracy.
Moreover, in other practical applications, the camera device may be blocked by an object for a short period of time. To prevent such occlusion from distorting the analysis result, the processing device may analyze multiple images captured within a preset duration to determine whether the cleaning robot is located in an underexposed area. The preset duration may be derived from engineering experience or experimental design and stored in the cleaning robot in advance; generally, it may be on the order of milliseconds. For example, when the processing device detects that one image is underexposed, it further checks that at least one subsequent image captured by the camera device also exhibits underexposure before concluding that the robot is located in an underexposed area.
In another embodiment, whether the cleaning robot is located in an underexposed area may also be determined by analyzing light-sensing information from a photosensitive element. The photosensitive element includes, but is not limited to, a photoresistor, a phototransistor, a photomultiplier tube, a CCD element, or a CMOS device. The photosensitive element may be arranged on the main body of the cleaning robot, for example at the front edge of the top cover along the direction of travel, or several photosensitive elements may be arranged at intervals along the edge of the top cover, so that the cleaning robot can be determined to be in an underexposed area as soon as part of it enters one. In some embodiments, the photosensitive element first converts the sensed light intensity into light intensity data and outputs it to the processing device, which then analyzes the data to determine whether the cleaning robot is located in an underexposed area. For example, the processing device presets an underexposure threshold; when the obtained light intensity data is below the threshold, the processing device determines that the cleaning robot is located in an underexposed area, and otherwise that it is not. The underexposure threshold may be derived from engineering experience or experimental design and stored in the cleaning robot in advance.
It should be noted that determining whether the cleaning robot is in an underexposed area based on one or more of the above analysis methods is merely an example and does not limit the ways of making this determination in the present application. In fact, a skilled person may also combine several grayscale analysis methods with results from the photosensitive-element method to further evaluate whether the cleaning robot is in an underexposed area; these variants are not described one by one here. Any analysis based on the image grayscale values mentioned in the present application, or any improvement thereon, for determining that the cleaning robot is located in an underexposed area should be regarded as a specific example of the present application.
When the processing device controls the moving system along a pre-planned route into an underexposed area, movement and cleaning operations can be performed in the corresponding cleaning mode. In step S130, the ways of controlling the behavior of the cleaning robot based on the analysis result and according to the preset cleaning mode include either of the following:
1) Adjusting the navigation route of the cleaning robot to leave the underexposed area. Referring to FIG. 2, a schematic diagram of the cleaning robot of the present application handling an underexposed area in one embodiment: in this example, the underexposed area is the area under a bed and the preset cleaning mode for underexposed areas is no cleaning. While working, the cleaning robot moves along the original navigation route in the direction of arrow A and performs cleaning. When the processing device determines, by any of the above methods, that the robot is in an underexposed area, i.e., under the bed, it modifies the navigation route and controls the moving system to leave the underexposed area. In this example, the cleaning robot turns 180° as shown by turning arrow B and continues moving, so that it leaves the space under the bed and continues in the direction of arrow C to clean other non-underexposed areas. It should be noted that, as those skilled in the art will appreciate, the 180° turn shown in FIG. 2 is merely an example; the turning angle is not limited to 180° and may be set flexibly according to actual needs, without exhaustive enumeration here.
2) Controlling the cleaning robot to pass through the underexposed area along the original navigation route, and controlling the cleaning robot to clean the traversed underexposed area according to the preset cleaning mode for underexposed areas. Referring to FIG. 3, a schematic diagram of the cleaning robot of the present application handling an underexposed area in another embodiment: in one example, the underexposed area is the area under a bed and the preset cleaning mode for underexposed areas is no cleaning. While working, the cleaning robot moves along the original navigation route in the direction of arrow A and performs cleaning. When the processing device determines, by any of the above methods, that the robot is in an underexposed area, i.e., under the bed, it controls the moving system to continue along the original route in the direction of arrow B while controlling the cleaning system not to clean the underexposed area currently occupied; once the robot is detected to be in a non-underexposed area, it continues to control the moving system along the original route in the direction of arrow C and controls the cleaning system to clean according to the cleaning mode for non-underexposed areas. In another example, the preset cleaning mode for underexposed areas is intensive cleaning: when the processing device determines that the robot is under the bed, it controls the moving system to continue along the original route in the direction of arrow B while controlling the cleaning system to clean the occupied underexposed area intensively; once the robot is detected to be in a non-underexposed area, it continues along the original route in the direction of arrow C and controls the cleaning system to perform normal cleaning according to the cleaning mode for non-underexposed areas.
In spaces that need cleaning but receive no light, such as a room with the lights off, it would be inaccurate for the processing device to control the cleaning system and moving system in a no-cleaning mode based solely on grayscale features or on light-sensing information from the photosensitive element. To adapt to the full range of practical scenarios, the storage device stores map data and landmark information, where the landmark information contains not only the attribute information described above but also object information represented by the localization features, attributes marking the corresponding underexposed areas, and the like. The control method further includes the step of constructing, in the navigation operating environment, landmark information of underexposed areas in the pre-built map data. When the processing device determines that the cleaning robot has moved from a non-underexposed area into an underexposed area, or from an underexposed area into a non-underexposed area, it adds attribute information to the pre-built map data. For example, the processing device determines the boundary position between the underexposed and non-underexposed areas from real-time localization; it then adds an underexposed-area attribute at that boundary position in the map data, or adds the underexposed-area attribute to landmark information already existing near the boundary.
In practical applications, indoor furnishings and lighting conditions may change. For example, a position previously recorded in landmark information as not underexposed because no sofa stood there may be analyzed as underexposed after a sofa is moved onto it. Likewise, a room is not underexposed while sunlight shines into it, yet becomes an underexposed area at night with the lights off. The control method therefore further includes the step of updating the landmark information of the corresponding underexposed areas in the pre-built map data. For example, when underexposure analysis via images or the light-sensing device shows that the cleaning robot is in an underexposed area while the corresponding landmark information carries a non-underexposed-area attribute, the processing device updates that landmark information; conversely, when the analysis shows that the robot is in a non-underexposed area while the landmark information carries an underexposed-area attribute, it likewise updates the landmark information.
On the basis of landmark information containing these attributes, the step of controlling the behavior of the cleaning robot according to the preset cleaning mode includes: navigating the cleaning robot based on the preset cleaning mode, the localization information identified in at least one image, and the landmark information, where the landmark information carries either an underexposed-area attribute or a non-underexposed-area attribute. In one specific example, given a preset cleaning mode of not cleaning underexposed areas while cleaning non-underexposed areas, and given the landmarks in the map data carrying underexposed-area attributes, the processing device constructs a cleaning route that excludes the underexposed areas; it determines the current position of the cleaning robot from at least one image captured along the route and controls the moving system to move and clean along the route, so the underexposed areas are not cleaned. In another specific example, given a preset cleaning mode of deferred intensive cleaning of underexposed areas while cleaning non-underexposed areas, the processing device constructs a first route segment that cleans the non-underexposed areas, a second segment that moves from the end of that route to the underexposed areas, and a third segment that cleans the underexposed areas. It determines the robot's current position from at least one image captured along the first segment and controls the moving system to move and clean along it; next, the processing device localizes from the captured images and controls the moving system to travel the second segment from the end of the first to the start of the third, keeping the cleaning system idle during that transit; finally, it controls the cleaning system to clean according to the preset intensive cleaning mode while moving along the third segment. During movement along the third segment, the processing device may localize and navigate using only the movement data provided by the motion sensors of the cleaning robot.
When the analysis determines that the cleaning robot is at the entrance of a room and that the corresponding room is an underexposed area, the processing device controls the behavior of the cleaning robot according to the cleaning mode for that room. For example, when the processing device, guided by landmark information, has moved the robot to the entrance of an unlit room, the image analysis will report an underexposed area. The processing device can take the localization features identified in cached images captured before entering the room, and obtain a door attribute from the landmark information containing those features, thereby determining that the underexposed area is a room that needs cleaning. It therefore calculates the distance moved and the current position from the movement data provided by the robot's motion sensors and controls the moving system of the cleaning robot to move within the room along the original navigation route, while also controlling the cleaning system to clean the room's floor and wall edges.
Furthermore, in some embodiments, the control method further includes the step of outputting prompt information after entering an underexposed area. For example, after entering an underexposed area, the processing device may remind the user that the cleaning robot has entered one by sounding an audible alarm or by sending a message to the user's mobile terminal.
The present application further provides a cleaning robot. Referring to FIG. 5, a structural schematic diagram of the cleaning robot of the present application in one embodiment: as shown, the cleaning robot comprises a control system 21, a camera device 22, a moving system 23, and a cleaning system 24, where the camera device 22, the moving system 23, and the cleaning system 24 are all connected to the control system 21. The control system 21 outputs control commands corresponding to a preset cleaning mode based on the analysis results of captured images; the camera device 22 captures images for the control system to process; the moving system 23 drives the cleaning robot to move based on the control commands; and the cleaning system 24 performs cleaning operations based on the control commands. The control commands include, but are not limited to, the movement direction, movement speed and movement distance determined from the navigation route and the current position, and the cleaning operations performed according to the preset cleaning mode.
The camera device 22 includes, but is not limited to, a camera, a video camera, a camera module integrating an optical system or a CCD chip, or a camera module integrating an optical system and a CMOS chip. The power supply of the camera device may be controlled by that of the cleaning robot, so that the camera device starts capturing images while the robot is powered on and moving. In addition, the camera device may be arranged on the main body of the cleaning robot, for example at the middle or edge of its top cover, or below the plane of its top surface in a recessed structure near the geometric center of the main body or near the edge of the main body.
In some embodiments, the camera device may be located on the top surface of the cleaning robot with the optical axis of its field of view within ±30° of the vertical. For example, the camera device sits at the middle or edge of the top surface of the cleaning robot with its optical axis at an angle of -30°, -29°, -28°, -27°, ..., -1°, 0°, 1°, 2°, ..., 29°, or 30° to the vertical. It should be noted that, as those skilled in the art will appreciate, these angles are only examples and do not limit the angular precision to 1°; depending on the actual robot design, the precision may be higher, such as 0.1°, 0.01°, or finer, without exhaustive enumeration here.
In some embodiments, the camera device comprises an adjustable aperture controlled by the control system, allowing the control system to obtain images with increased light intake. For example, the processing device can change the intensity of light received by the camera device by increasing the light intake; after compensating the light intake, the processing device analyzes the images captured by the adjusted camera device. The aperture is usually arranged inside the camera device and regulates how much light enters it. For example, when the camera device is in an underexposed area, the aperture can be enlarged automatically to increase the light intake, after which the processing device performs underexposure analysis on the images captured by the adjusted camera device to improve the accuracy of the analysis result.
The moving system 23 moves under the control of the control system 21 and is located at the bottom of the cleaning robot. In one embodiment, the moving system 23 comprises a drive control device and at least two roller sets, of which at least one roller set is a controlled roller set. The drive control device is connected to the control system and drives the controlled roller set to roll based on the control commands output by the control system.
The drive control device comprises a drive motor connected to the roller sets for directly driving them to roll. The drive control device may also comprise one or more processors (such as a CPU or a microcontroller unit (MCU)) dedicated to controlling the drive motor. For example, the microcontroller unit converts the control commands output by the control system into electrical signals that control the drive motor, and regulates the rotation speed, steering and the like of the drive motor according to those signals so as to drive the cleaning robot. The processor in the drive control device may be shared with the processor in the control system or provided independently; for example, the processor in the drive control device acts as a slave processing device and the processor in the control system as the master device, with the drive control device performing movement control under the control system's direction; alternatively, the two processors are shared. The drive control device receives the control commands output by the control system through a program interface and drives the controlled roller set to roll accordingly.
The cleaning system 24 performs cleaning operations under the control of the control system 21. In one embodiment, the cleaning system 24 comprises a cleaning assembly and a cleaning drive control assembly, where the cleaning drive control assembly is connected to the control system and drives the cleaning assembly to clean the floor based on the control commands output by the control system.
The cleaning assembly may comprise a roller brush assembly, a filter screen, a scrubbing assembly, a suction duct, a dust box (or waste box), a suction fan motor, and the like. The roller brush assembly and the scrubbing assembly may be provided alternatively or both, depending on the actual design of the cleaning robot. The roller brush assembly includes, but is not limited to, side brushes, side brush drivers, rollers, and roller drivers. The scrubbing assembly includes, but is not limited to, a water container, a wiping cloth, a mounting structure for the cloth, and a driver of that mounting structure.
The cleaning drive control assembly may comprise one or more processors (such as a CPU or a microcontroller unit (MCU)) dedicated to controlling the cleaning assembly. The processor in the cleaning drive control assembly may be shared with the processor in the control system or provided independently; for example, the processor in the cleaning drive control assembly acts as a slave processing device and the processor in the control system as the master device, with the cleaning drive control assembly performing cleaning operations based on the control commands output by the control system; alternatively, the two processors are shared.
The control system may be as shown in FIG. 1 and perform control processing as described above with reference to FIG. 1, which is not detailed again here. The storage device 211 shown in FIG. 5 may correspond to the storage device 11 described with FIG. 1, and the processing device 213 shown in FIG. 5 may correspond to the processing device 13 described with FIG. 1. Taking as an example the control system 21 of FIG. 5 comprising the storage device 211 and the processing device 213, with the control system 21 connected to the moving system 23, the cleaning system 24, and the camera device 22, the working process by which the control system 21 outputs control commands corresponding to the preset cleaning mode based on the analysis results of captured images is described as follows:
The storage device 211 stores one or more programs, including the programs described below that the processing device 213 invokes to perform the control and analysis steps. The processing device 213 performs control processing by invoking the programs stored in the storage device 211.
First, the processing device controls the camera device to capture images in real time in the navigation operating environment of the cleaning robot. For example, the camera device may be a camera for capturing static images or video. In some embodiments, the camera device may also include an adjustable aperture; the processing device can adjust the aperture to increase the light intake and thereby change the intensity of light received by the camera device. After compensating the light intake, the processing device controls the camera device to capture images, and in subsequent processing analyzes the images captured by the adjusted camera device.
Then, the processing device analyzes at least one captured image. Where the camera device provides static images, the processing device may analyze at least one of the acquired static images. Where the camera device provides video, the processing device may sample frames from the acquired video, continuously or discontinuously, and select a frame to analyze as an image. In some embodiments, the processing device may also analyze multiple images captured within a preset duration, where the preset duration may be derived from engineering experience or experimental design and stored in the cleaning robot in advance; generally, it may be on the order of milliseconds.
The processing device may determine the current position of the cleaning robot using localization features identified in at least one image. In another embodiment, the processing device may locate the cleaning robot using localization features identified in at least one image together with pre-established landmark information. The landmark information may be attribute information collected during previous navigations and corresponding to localization points in the map data, including but not limited to: the localization features that the camera device can capture at a given localization point of the map data, the map data in physical space when the localization features were previously captured, the positions of the localization features in the corresponding image frames when previously captured, and the position and attitude of the cleaning robot when the corresponding features were captured. The landmark information may be stored in the storage device together with the map data. The map data may be pre-built based on SLAM (Simultaneous Localization and Mapping) or VSLAM technology.
In addition, the processing device analyzes whether the cleaning robot is located in an underexposed area using grayscale features of at least one image. In another embodiment, the processing device may also determine whether the cleaning robot is located in an underexposed area by analyzing light-sensing information from a photosensitive element.
Next, the processing device controls the behavior of the cleaning robot based on the analysis result and according to the preset cleaning mode, where the cleaning modes include a cleaning mode corresponding to underexposed areas.
In some embodiments, the processing device in the cleaning robot may determine the cleaning mode corresponding to underexposed areas based on user input. The ways in which the processing device controls the behavior of the cleaning robot based on the analysis result and according to the preset cleaning mode include either of the following. 1) Adjusting the navigation route of the cleaning robot to leave the underexposed area: referring to FIG. 2, a schematic diagram of the cleaning robot of the present application handling an underexposed area in one embodiment, in this example the underexposed area is the area under a bed and the preset cleaning mode for underexposed areas is no cleaning; while working, the cleaning robot moves along the original navigation route in the direction of arrow A and performs cleaning, and when the processing device determines by any of the above methods that the robot is in an underexposed area, i.e., under the bed, it modifies the navigation route and controls the moving system to leave the underexposed area; in this example, the cleaning robot turns 180° as shown by turning arrow B and continues moving, leaving the space under the bed and continuing in the direction of arrow C to clean other non-underexposed areas. It should be noted that the 180° turn shown in FIG. 2 is merely an example; the turning angle is not limited to 180° and may be set flexibly according to actual needs, without exhaustive enumeration here. 2) Controlling the cleaning robot to pass through the underexposed area along the original navigation route, and controlling the cleaning robot to clean the traversed underexposed area according to the preset cleaning mode for underexposed areas: referring to FIG. 3, a schematic diagram of the cleaning robot of the present application handling an underexposed area in another embodiment, in one example the underexposed area is the area under a bed and the preset cleaning mode for underexposed areas is no cleaning; while working, the cleaning robot moves along the original navigation route in the direction of arrow A and performs cleaning, and when the processing device determines by any of the above methods that the robot is under the bed, it controls the moving system to continue along the original route in the direction of arrow B while controlling the cleaning system not to clean the underexposed area currently occupied; once the robot is detected to be in a non-underexposed area, it continues to control the moving system along the original route in the direction of arrow C and controls the cleaning system to clean according to the cleaning mode for non-underexposed areas. In another example, the preset cleaning mode for underexposed areas is intensive cleaning: when the processing device determines that the robot is under the bed, it controls the moving system to continue along the original route in the direction of arrow B while controlling the cleaning system to clean the occupied underexposed area intensively; once the robot is detected to be in a non-underexposed area, it continues along the original route in the direction of arrow C and controls the cleaning system to perform normal cleaning according to the cleaning mode for non-underexposed areas.
Furthermore, in some embodiments, after entering an underexposed area, the processing device may also remind the user that the cleaning robot has entered one, for example by sounding an audible alarm or by sending a message to the user's mobile terminal.
In practical applications, for example in spaces that need cleaning but receive no light, such as a room with the lights off, it would be inaccurate for the processing device to control the cleaning system and moving system in a no-cleaning mode based solely on grayscale features or on light-sensing information from the photosensitive element. To adapt to the full range of practical scenarios, the storage device stores map data and landmark information, where the landmark information contains not only the attribute information described above but also object information represented by the localization features, attributes marking the corresponding underexposed areas, and the like. The processing device further performs, in the navigation operating environment, the step of constructing landmark information of underexposed areas in the pre-built map data. When the processing device determines that the cleaning robot has moved from a non-underexposed area into an underexposed area, or from an underexposed area into a non-underexposed area, it adds attribute information to the pre-built map data.
Moreover, since indoor furnishings and lighting conditions may change, the processing device further performs the step of updating the landmark information of the corresponding underexposed areas in the pre-built map data. For example, when underexposure analysis via images or the light-sensing device shows that the cleaning robot is in an underexposed area while the corresponding landmark information carries a non-underexposed-area attribute, the processing device updates that landmark information; conversely, when the analysis shows that the robot is in a non-underexposed area while the landmark information carries an underexposed-area attribute, it likewise updates the landmark information.
On the basis of landmark information containing these attributes, the processing device navigates the cleaning robot based on the preset cleaning mode, the localization information identified in at least one image, and the landmark information, where the landmark information carries either an underexposed-area attribute or a non-underexposed-area attribute. In one specific example, given a preset cleaning mode of not cleaning underexposed areas while cleaning non-underexposed areas, and given the landmarks in the map data carrying underexposed-area attributes, the processing device constructs a cleaning route that excludes the underexposed areas; it determines the current position of the cleaning robot from at least one image captured along the route and controls the moving system to move and clean along the route, so the underexposed areas are not cleaned. In another specific example, given a preset cleaning mode of deferred intensive cleaning of underexposed areas while cleaning non-underexposed areas, the processing device constructs a first route segment that cleans the non-underexposed areas, a second segment that moves from the end of that route to the underexposed areas, and a third segment that cleans the underexposed areas. It determines the robot's current position from at least one image captured along the first segment and controls the moving system to move and clean along it; next, the processing device localizes from the captured images and controls the moving system to travel the second segment from the end of the first to the start of the third, keeping the cleaning system idle during that transit; finally, it controls the cleaning system to clean according to the preset intensive cleaning mode while moving along the third segment. During movement along the third segment, the processing device may localize and navigate using only the movement data provided by the motion sensors of the cleaning robot.
When the analysis determines that the cleaning robot is at the entrance of a room and that the corresponding room is an underexposed area, the processing device controls the behavior of the cleaning robot according to the cleaning mode for that room. For example, when the processing device, guided by landmark information, has moved the robot to the entrance of an unlit room, the image analysis will report an underexposed area. The processing device can take the localization features identified in cached images captured before entering the room, and obtain a door attribute from the landmark information containing those features, thereby determining that the underexposed area is a room that needs cleaning. It therefore calculates the distance moved and the current position from the movement data provided by the robot's motion sensors and controls the moving system of the cleaning robot to move within the room along the original navigation route, while also controlling the cleaning system to clean the room's floor and wall edges.
It should also be noted that, as will be clear to those skilled in the art from the above description of the embodiments, part or all of the present application may be implemented by software in combination with a necessary general-purpose hardware platform. On this understanding, the present application further provides a storage medium of a computer device, storing at least one program which, when executed by a processor, performs any of the control methods described above.
On this understanding, the technical solution of the present application, in essence or in the part contributing over the prior art, may be embodied in the form of a software product. The computer software product may include one or more machine-readable media storing machine-executable instructions which, when executed by one or more machines such as a computer, a computer network, or other electronic devices, cause the machine or machines to perform operations according to embodiments of the present application, for example the steps of the robot localization method. Machine-readable media may include, but are not limited to, floppy disks, optical disks, CD-ROMs (compact disc read-only memories), magneto-optical disks, ROMs (read-only memories), RAMs (random access memories), EPROMs (erasable programmable read-only memories), EEPROMs (electrically erasable programmable read-only memories), magnetic or optical cards, flash memory, or other types of media/machine-readable media suitable for storing machine-executable instructions. The storage medium may be located in the robot or in a third-party server, for example a server providing an application store; no particular application store is limited here, such as the MIUI App Store, the Huawei AppGallery, or the Apple App Store.
The present application may be used in numerous general-purpose or special-purpose computing system environments or configurations, for example: personal computers, server computers, handheld or portable devices, tablet devices, multiprocessor systems, microprocessor-based systems, set-top boxes, programmable consumer electronics, network PCs, minicomputers, mainframe computers, and distributed computing environments including any of the above systems or devices.
The present application may be described in the general context of computer-executable instructions executed by a computer, such as program modules. Generally, program modules include routines, programs, objects, components, data structures and the like that perform particular tasks or implement particular abstract data types. The present application may also be practiced in distributed computing environments in which tasks are performed by remote processing devices connected through a communication network; in such environments, program modules may be located in both local and remote computer storage media, including storage devices.
The above embodiments merely illustrate the principles and effects of the present application and are not intended to limit it. Anyone familiar with this technology may modify or change the above embodiments without departing from the spirit and scope of the present application. Accordingly, all equivalent modifications or changes completed by persons of ordinary skill in the art without departing from the spirit and technical ideas disclosed by the present application shall still be covered by the claims of the present application.

Claims (32)

  1. A control method of a cleaning robot, wherein the cleaning robot comprises a camera device, a moving system, and a cleaning system, characterized in that the control method comprises the following steps:
    controlling the camera device to capture images in real time in a navigation operating environment of the cleaning robot;
    analyzing at least one captured image;
    controlling the behavior of the cleaning robot based on the result of the analysis and according to a preset cleaning mode, wherein the cleaning modes include a cleaning mode corresponding to underexposed areas.
  2. The control method of a cleaning robot according to claim 1, characterized in that the step of analyzing at least one captured image comprises: determining the current position of the cleaning robot using localization features identified in at least one image.
  3. The control method of a cleaning robot according to claim 2, characterized in that the step of determining the current position of the cleaning robot using localization features identified in at least one image comprises: locating the cleaning robot using the localization features identified in at least one image and pre-established landmark information.
  4. The control method of a cleaning robot according to claim 1, characterized in that the step of analyzing at least one captured image comprises: analyzing, using grayscale features of at least one image, whether the cleaning robot is located in an underexposed area.
  5. The control method of a cleaning robot according to claim 1 or 4, characterized by further comprising: determining whether the cleaning robot is located in an underexposed area through analysis of light-sensing information from a photosensitive element.
  6. The control method of a cleaning robot according to claim 1, characterized by further comprising: a step of adjusting the light intake of the camera device, so that at least one image captured by the adjusted camera device is analyzed.
  7. The control method of a cleaning robot according to claim 6, characterized in that the step of adjusting the light intake of the camera device comprises: increasing the light intake by adjusting the aperture of the camera device.
  8. The control method of a cleaning robot according to claim 1, characterized in that the step of analyzing at least one captured image comprises: analyzing multiple images captured within a preset duration to determine whether the cleaning robot is located in an underexposed area.
  9. The control method of a cleaning robot according to claim 1, characterized by further comprising: a step of constructing landmark information of underexposed areas in pre-built map data; or a step of updating landmark information of corresponding underexposed areas in pre-built map data.
  10. The control method of a cleaning robot according to claim 9, characterized in that the step of controlling the behavior of the cleaning robot according to the preset cleaning mode comprises: navigating the cleaning robot based on the preset cleaning mode, localization information identified in at least one image, and the landmark information.
  11. The control method of a cleaning robot according to claim 1, characterized by further comprising a step of determining the cleaning mode corresponding to underexposed areas based on user input.
  12. The control method of a cleaning robot according to any one of claims 1 to 11, characterized in that the step of controlling the behavior of the cleaning robot based on the result of the analysis and according to the preset cleaning mode comprises either of the following:
    adjusting the navigation route of the cleaning robot so as to leave the underexposed area;
    controlling the cleaning robot to pass through the underexposed area along the original navigation route, and controlling the cleaning robot to clean the traversed underexposed area according to the preset cleaning mode corresponding to underexposed areas.
  13. The control method of a cleaning robot according to claim 1, characterized by further comprising a step of outputting prompt information after entering an underexposed area.
  14. A control system of a cleaning robot, wherein the cleaning robot comprises a camera device, a moving system, and a cleaning system, characterized by comprising:
    a storage device storing one or more programs;
    a processing device, connected to the storage device, which performs the following steps by invoking the one or more programs:
    controlling the camera device to capture images in real time in a navigation operating environment of the cleaning robot;
    analyzing at least one captured image;
    controlling the behavior of the cleaning robot based on the result of the analysis and according to a preset cleaning mode, wherein the cleaning modes include a cleaning mode corresponding to underexposed areas.
  15. The control system of a cleaning robot according to claim 14, characterized in that the step, performed by the processing device, of analyzing at least one captured image comprises: determining the current position of the cleaning robot using localization features identified in at least one image.
  16. The control system of a cleaning robot according to claim 15, characterized in that the step, performed by the processing device, of determining the current position of the cleaning robot using localization features identified in at least one image comprises: a step of locating the cleaning robot using the localization features identified in at least one image and pre-established landmark information.
  17. The control system of a cleaning robot according to claim 14, characterized in that the step, performed by the processing device, of analyzing at least one captured image comprises: analyzing, using grayscale features of at least one image, whether the cleaning robot is located in an underexposed area.
  18. The control system of a cleaning robot according to claim 14 or 17, characterized in that the processing device further performs a step of determining whether the cleaning robot is located in an underexposed area through analysis of light-sensing information from a photosensitive element.
  19. The control system of a cleaning robot according to claim 14, characterized in that the processing device further performs a step of adjusting the light intake of the camera device, so that at least one image captured by the adjusted camera device is analyzed.
  20. The control system of a cleaning robot according to claim 19, characterized in that the step, performed by the processing device, of adjusting the light intake of the camera device comprises: increasing the light intake by adjusting the aperture of the camera device.
  21. The control system of a cleaning robot according to claim 14, characterized in that the step, performed by the processing device, of analyzing at least one captured image comprises: analyzing multiple images captured within a preset duration to determine whether the cleaning robot is located in an underexposed area.
  22. The control system of a cleaning robot according to claim 14, characterized in that the processing device further performs a step of constructing landmark information of underexposed areas in pre-built map data, or a step of updating landmark information of corresponding underexposed areas in pre-built map data.
  23. The control system of a cleaning robot according to claim 22, characterized in that the step of controlling the behavior of the cleaning robot according to the preset cleaning mode comprises: navigating the cleaning robot based on the preset cleaning mode, localization information identified in at least one image, and the landmark information.
  24. The control system of a cleaning robot according to claim 14, characterized in that the processing device further performs a step of determining the cleaning mode corresponding to underexposed areas based on user input.
  25. The control system of a cleaning robot according to any one of claims 14 to 24, characterized in that the step, performed by the processing device, of controlling the behavior of the cleaning robot based on the result of the analysis and according to the preset cleaning mode comprises either of the following:
    adjusting the navigation route of the cleaning robot so as to leave the underexposed area;
    controlling the cleaning robot to pass through the underexposed area along the original navigation route, and controlling the cleaning robot to clean the traversed underexposed area according to the preset cleaning mode corresponding to underexposed areas.
  26. The control system of a cleaning robot according to claim 14, characterized in that the processing device further performs a step of outputting prompt information after entering an underexposed area.
  27. A cleaning robot, characterized by comprising:
    the control system according to any one of claims 14-26, configured to output control commands corresponding to a preset cleaning mode based on analysis results of captured images;
    a camera device, connected to the control system, configured to capture images for the control system to process;
    a moving system, connected to the control system, configured to drive the cleaning robot to move based on the control commands;
    a cleaning system, connected to the control system, configured to perform cleaning operations based on the control commands.
  28. The cleaning robot according to claim 27, characterized in that the camera device is located on the top surface of the cleaning robot, and the optical axis of the field of view of the camera device is within ±30° of the vertical.
  29. The cleaning robot according to claim 27, characterized in that the camera device comprises an adjustable aperture controlled by the control system, allowing the control system to obtain images with increased light intake.
  30. The cleaning robot according to claim 27, characterized in that the moving system comprises:
    at least two roller sets, wherein at least one roller set is a controlled roller set;
    a drive control device, connected to the control system, configured to drive the controlled roller set to roll based on the control commands.
  31. The cleaning robot according to claim 27, characterized in that the cleaning system comprises:
    a cleaning assembly;
    a cleaning drive control assembly, connected to the control system, configured to drive the cleaning assembly to clean the floor based on the control commands.
  32. A storage medium of a computer device, characterized by storing at least one program which, when executed by a processor, performs the method according to any one of claims 1-13.
PCT/CN2018/090659 2017-12-15 2018-06-11 Control method and system, and cleaning robot using the same WO2019114221A1 (zh)
