WO2017073955A1 - Cleaning robot and control method thereof - Google Patents
- Publication number: WO2017073955A1 (application PCT/KR2016/011823)
- Authority: WO (WIPO, PCT)
- Prior art keywords: cleaning, cleaning robot, user, robot, module
Classifications
- A47L7/0085: Suction cleaners adapted for special purposes not related to cleaning
- A47L9/28: Installation of the electric equipment; controlling suction cleaners by electric means
- A47L9/2815: Sensing the amount or condition of incoming dirt or dust using optical detectors
- A47L9/2826: Sensing the condition of the floor
- A47L9/2894: Details related to signal transmission in suction cleaners
- A47L11/185: Motor-driven floor machines with rotating roll brushes and supply of cleaning agents
- A47L11/20: Floor surfacing or polishing machines combined with vacuum cleaning devices
- A47L11/201: Combined machines with supply of cleaning agents
- A47L11/202: Combined machines having separate drive for the cleaning brushes
- A47L11/4011: Regulation of the cleaning machine by electric means; control systems and remote control systems therefor
- A47L11/4044: Vacuuming or pick-up tools; squeegees
- A47L2201/02: Robotic cleaning machines: docking stations; docking operations
- A47L2201/04: Robotic cleaning machines: automatic control of the travelling movement; automatic obstacle detection
- A47L2201/06: Robotic cleaning machines: control of the cleaning action; automatic detection of the surface condition
- B25J5/007: Manipulators mounted on wheels
- B25J9/1602: Programme controls characterised by the control system, structure, architecture
- B25J9/1694: Programme controls using sensors other than normal servo feedback; sensor fusion
- B25J9/1697: Vision-controlled systems
- B25J11/0085: Manipulators for service tasks: cleaning
- B25J13/08: Controls for manipulators by means of sensing devices
- B25J15/0491: End effectors with provision for remote detachment or exchange, comprising end-effector racks
- G05D1/0274: Position or course control for land vehicles using mapping information stored in a memory device
- G05B2219/45098: Vacuum cleaning robot
Definitions
- The present invention relates to a cleaning robot that provides human-robot interaction (HRI) technology and a control method thereof.
- A cleaning robot is a device that automatically cleans a cleaning space by suctioning up foreign matter, such as dust accumulated on the floor, while traveling through the cleaning space without user operation. That is, the cleaning robot cleans the cleaning space as it travels through it.
- According to one aspect, a cleaning robot includes: a modular unit in which one or more modules supporting different functions are integrated; and a controller configured to control the operation of the modular unit so as to control at least one of a device in the cleaning robot and an IoT (Internet of Things) apparatus.
- The modular unit may include a gripper module for controlling a robot arm so that a cleaning head is mounted on the robot arm, and the controller may control cleaning performance using the mounted cleaning head by controlling the operation of the gripper module.
- The modular unit may further include a communication module configured to support connection with at least one other cleaning robot through a communication network, and the controller may control joint cleaning by controlling the operation of the communication module based on at least one of the support specification supported by the at least one other cleaning robot and the size and shape of the cleaning area.
- The modular unit may include a recognition module for detecting a user present in the room, and the controller may control the performance of cleaning based on at least one of the detection result from the recognition module and the detection result from the IoT apparatus.
- The modular unit may include a sensor module for acquiring indoor environment information, and the controller may control at least one of the cleaning robot and the IoT apparatus to adjust the indoor environment, based on the indoor environment information acquired through the sensor module and the desired information about the indoor environment set by the user.
- The modular unit may include a communication module for supporting connection with an external device through a communication network and an image unit for acquiring image information, and the controller may obtain image information by controlling the image unit when a remote connection request is received through the communication module.
- According to another aspect, a cleaning robot includes: a robot arm on which a cleaning head is mounted; and a controller configured to determine, among a plurality of cleaning heads, the cleaning head corresponding to a cleaning mode, to control mounting between the robot arm and the determined cleaning head, and to control cleaning using the mounted cleaning head.
- The robot arm may be provided with a water supply pipe for supplying water to the cleaning head and a suction passage for sucking in dust.
- The robot arm may be provided with a docking portion for guiding coupling with the cleaning head and an electromagnet for fixing the cleaning head.
- The cleaning robot may further include a cleaning head storage box in which the plurality of cleaning heads are stored.
- The cleaning head storage box may be provided at a station of the cleaning robot or at a predetermined position.
- In the cleaning head storage box, each of the plurality of cleaning heads may be stored at a predetermined position.
- The controller may identify the position of the cleaning head storage box using an infrared (IR) sensor mounted in the cleaning head storage box.
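As a rough illustration of the head-selection logic described above, the sketch below maps a cleaning mode to a head and to the predetermined storage position of that head; the mode names, head names, and slot numbers are hypothetical, not taken from the patent:

```python
# Sketch: choose the cleaning head for a cleaning mode and find the
# predetermined slot where it is stored in the cleaning head storage box.
# Mode names, head names, and slot numbers are illustrative assumptions.

HEAD_FOR_MODE = {
    "dry": "brush_head",
    "wet": "mop_head",
    "steam": "steam_head",
}

# Each head is stored at a predetermined position in the storage box.
STORAGE_SLOTS = {
    "brush_head": 0,
    "mop_head": 1,
    "steam_head": 2,
}

def select_head(cleaning_mode: str) -> tuple:
    """Return (head, storage slot) for the requested cleaning mode."""
    head = HEAD_FOR_MODE[cleaning_mode]
    return head, STORAGE_SLOTS[head]

head, slot = select_head("wet")
print(head, slot)  # mop_head 1
```

The robot arm would then be driven to the returned slot and the docking portion and electromagnet engaged to fix the head.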
- According to another aspect, a cleaning robot includes: a communication module for establishing a communication connection with at least one other cleaning robot;
- a determination unit that determines a joint cleaning method with the at least one other cleaning robot based on at least one of the support specification supported by the at least one other cleaning robot and the size and shape of the cleaning area; and
- a controller configured to control the cleaning robot and the at least one other cleaning robot based on the determined joint cleaning method.
- The cleaning robot may include a storage unit for storing map data on the area in which the cleaning robot is located.
- The determination unit may determine any one of: a group joint cleaning method in which the cleaning robot cleans the same area together with the at least one other cleaning robot; a zone-based joint cleaning method in which the cleaning area is divided into zones that the robots clean separately; and a multi-pattern joint cleaning method in which the group method and the zone-based method are mixed.
- The controller may control the operation of at least one of the cleaning robot and the at least one other cleaning robot using at least one of the environmental information and the state information included in cleaning-situation information received from the at least one other cleaning robot.
- According to another aspect, a cleaning robot includes: a communication unit configured to transmit the cleaning robot's support specification to at least one other cleaning robot and, in response, to receive a cleaning method and a cleaning area from the cleaning robot that received the transmitted support specification; and a controller for controlling a device in the cleaning robot based on the received cleaning method and cleaning area.
- The communication unit may transmit environmental information of the cleaning area and state information of the cleaning robot.
- The controller may perform cleaning by controlling a device in the cleaning robot based on any one of: a group joint cleaning method in which the cleaning robot cleans the same area together with the at least one other cleaning robot; a zone-based joint cleaning method in which the cleaning area is divided into zones cleaned separately; and a multi-pattern joint cleaning method in which the group method and the zone-based method are mixed.
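The joint cleaning methods above can be sketched as a simple planner that picks a method and, for zone-based cleaning, divides the area among the robots in proportion to each robot's capability from its support specification; the cleaning-rate field and thresholds are illustrative assumptions, not values from the patent:

```python
# Sketch: choose a joint-cleaning method and split the area into zones,
# one per robot, weighted by each robot's cleaning rate (m^2/min).
# The rate field and the 20 m^2 threshold are illustrative assumptions.

def choose_method(num_robots: int, area_m2: float) -> str:
    """Group cleaning for small areas, zone-based cleaning for large ones."""
    if num_robots < 2 or area_m2 < 20:
        return "group"
    return "zone"

def split_zones(area_m2: float, rates: list) -> list:
    """Divide the area proportionally to each robot's cleaning rate."""
    total = sum(rates)
    return [area_m2 * r / total for r in rates]

print(choose_method(2, 60.0))         # zone
print(split_zones(60.0, [2.0, 1.0]))  # [40.0, 20.0]
```

A multi-pattern method would simply apply this decision per region instead of once for the whole area.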
- According to another aspect, a cleaning robot includes: a sensing unit for detecting a user based on at least one of at least one IoT apparatus and a sensor module in the room; and a cleaning controller that, when the distance to the detected user is determined to be closer than a preset level, controls cleaning based on the detected state of the user.
- The cleaning controller may switch to a silent mode or change the cleaning area based on the detected user's behavior.
- The cleaning controller may control the cleaning operation based on at least one of: the user's behavior and location detected by the at least one IoT apparatus or the sensor unit; and the user's voice and ambient noise received through the voice recognition unit.
- The cleaning controller may adjust the level of the silent mode according to the amount of ambient noise received through the voice recognition unit.
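One plausible reading of the silent-mode level adjustment is a mapping from measured ambient noise to a suction-power level, so the robot stays quieter than its surroundings; the dBA thresholds below are illustrative assumptions, not values from the patent:

```python
# Sketch: map measured ambient noise (dBA) to a suction-power level.
# Thresholds and the three-level scale are illustrative assumptions.

def silent_mode_level(ambient_db: float) -> int:
    """Return a power level from 1 (quietest) to 3 (normal)."""
    if ambient_db < 40:    # quiet room: run at minimum power
        return 1
    if ambient_db < 60:    # conversation-level noise
        return 2
    return 3               # loud environment: full power is acceptable

print(silent_mode_level(35))  # 1
print(silent_mode_level(55))  # 2
print(silent_mode_level(70))  # 3
```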
- According to another aspect, a cleaning robot includes: a desired-information setting unit for setting desired information about the indoor environment;
- a generation unit configured to acquire indoor environment information and generate an indoor environment map; and
- an indoor environment controller configured to compare the desired information about the indoor environment with the acquired indoor environment map and to adjust the indoor environment based on the comparison result.
- The cleaning robot may further include a controller for controlling the overall operation of the cleaning robot, and a vehicle that is controlled by the controller and equipped with a sensor module for acquiring the indoor environment information.
- The generation unit may acquire indoor environment information from at least one of a sensor module mounted on the cleaning robot, a sensor module mounted on the vehicle, and an IoT apparatus, combine the acquired information, and map the combined information onto map information to generate an indoor environment map.
- The indoor environment controller may control at least one of the device inside the cleaning robot and the IoT apparatus so that the indoor environment is adjusted to correspond to the desired information about the indoor environment.
- The generation unit may also generate an indoor environment map by controlling the operation of at least one of the sensor module mounted on the vehicle and the IoT apparatus to obtain indoor environment information.
- The indoor environment controller may request re-measurement if the difference between the indoor environment information obtained through the IoT apparatus and that obtained through the sensor module mounted on the cleaning robot exceeds a preset range.
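The re-measurement check above amounts to comparing the two readings against a preset range; a minimal sketch, assuming a hypothetical tolerance of 2.0 units:

```python
# Sketch: cross-check an IoT reading against the robot's on-board sensor
# and request re-measurement when they disagree beyond a preset range.
# The 2.0-unit tolerance is an illustrative assumption.

TOLERANCE = 2.0

def needs_remeasurement(iot_value: float, onboard_value: float,
                        tolerance: float = TOLERANCE) -> bool:
    """True when the two sensors disagree by more than the tolerance."""
    return abs(iot_value - onboard_value) > tolerance

print(needs_remeasurement(24.0, 24.5))  # False
print(needs_remeasurement(24.0, 27.1))  # True
```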
- According to another aspect, a cleaning robot includes: an image unit for obtaining image information in response to a user's remote connection request;
- a communication unit for transmitting the obtained image information; and
- a controller for controlling a device in the cleaning robot.
- When another task is in progress, the controller may control the device in the cleaning robot to stop that task and activate the image unit.
- The image unit may acquire image information about the user's region of interest (ROI), or acquire image information under the user's remote control.
- The image unit may acquire image information about a plurality of regions of interest at a preset time, according to a preset priority.
- The cleaning robot may further include a vehicle controlled by the controller.
- The image unit may be mounted on at least one of one surface of the cleaning robot and one surface of a flying vehicle.
- The controller may control a device in the cleaning robot to resume the stopped task when the user's remote connection is terminated.
- The controller may detect the position of the vehicle and display the position of the vehicle on map information based on the detection result.
- According to another aspect, a cleaning robot includes: a voice recognition module for recognizing a voice, recording it, and identifying the point at which the recognized voice was generated; a communication unit for transmitting the recorded voice; and a controller for controlling a device in the cleaning robot to output a preset voice or to connect to a preset contact.
- The voice recognition module may identify the point at which the recognized voice was generated by using voice recognition units arranged in an array.
- The controller may control the device in the cleaning robot to move to the identified point and output a preset voice, to capture image information about the identified point, or to connect a telephone call through a home network.
- According to another aspect, a cleaning robot includes: a recognition unit for recognizing a call signal of the user;
- a voice recognition module that receives a voice command from the user and derives a recognition result for the received voice command; and
- a controller that, when the identified user is determined to have the authority to use the recognition result, controls a device in the cleaning robot based on the recognition result.
- The controller may include a security module that determines the user's right of use through at least one of the user's motion and the user's voice.
- The controller may control a device in the cleaning robot and at least one IoT apparatus connected through a communication network to provide a service corresponding to the recognition result.
- According to another aspect, a cleaning robot includes: a voice recognition module that detects abnormal signs from the user's received voice;
- a sensor module for obtaining the user's biometric information; and
- a controller configured to determine the degree of the user's state based on the acquired biometric information and to control a device in the cleaning robot according to a preset countermeasure based on the determined state.
- The voice recognition module may identify, through the voice recognition unit, the point at which the received user's voice was generated.
- The controller may control a speaker to deliver a preset voice to the user according to the determined state of the user, or control the communication unit to connect to a preset contact.
- The controller may control the operation of at least one IoT apparatus connected through a home network according to the determined state of the user.
- According to another aspect, a cleaning robot includes: a sensor module for obtaining the user's biometric information at predetermined intervals; and a controller that, when it determines from the acquired biometric information that there is an abnormality in the user's state, controls a device in the cleaning robot according to preset countermeasures based on the determined degree of the user's state and the user's reaction.
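A minimal sketch of grading the user's state from one biometric signal and escalating the countermeasure, assuming hypothetical heart-rate thresholds and action names (none of these values come from the patent):

```python
# Sketch: grade the user's state from a biometric reading and pick a
# preset countermeasure. Thresholds, grades, and action names are
# illustrative assumptions.

def assess_state(heart_rate: int) -> str:
    """Grade a heart-rate reading as normal, caution, or emergency."""
    if 50 <= heart_rate <= 110:
        return "normal"
    if 40 <= heart_rate <= 140:
        return "caution"
    return "emergency"

COUNTERMEASURE = {
    "normal": "none",
    "caution": "speak_to_user",        # play a preset voice, await reaction
    "emergency": "call_preset_contact",
}

print(COUNTERMEASURE[assess_state(72)])  # none
print(COUNTERMEASURE[assess_state(45)])  # speak_to_user
print(COUNTERMEASURE[assess_state(30)])  # call_preset_contact
```

In the claim above, the user's reaction to the "caution" step would feed back into whether the robot escalates to contacting a preset number.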
- a cleaning robot includes: an image unit which acquires image information about an object by following an object selected by a user; And a controller configured to perform a process corresponding to a situation generated according to the movement of the detected object from the acquired image information.
- the imager may receive an object to be followed through an input unit, or select an object to be followed in conjunction with an external device through a communication unit, and may follow the selected object to obtain image information about the object.
- the controller may detect contamination generated in accordance with the movement of the detected object, and when the contamination is detected, the controller may perform cleaning by controlling a device in the cleaning robot.
- the controller may control the communication unit to connect to a preset contact or transmit a message.
- controller may control a speaker to transmit a warning message to the object.
- the apparatus may further include a voice recognition module for recognizing the voice of the object.
- the controller may recognize the voice of the object through the voice recognition module and perform an operation corresponding to the recognition result.
- the controller may provide route guidance to the destination based on the map information stored in the memory when it is determined, as a result of recognition through the voice recognition module, that a route to a destination is requested.
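Route guidance over stored map information could be realized with any standard planner; as one illustrative assumption, the sketch below models the map as an occupancy grid (0 = free, 1 = obstacle) and uses breadth-first search. The grid, coordinates, and function name are hypothetical.

```python
from collections import deque

# Hypothetical sketch: shortest route on a stored occupancy-grid map via BFS.
def guide_route(grid, start, goal):
    rows, cols = len(grid), len(grid[0])
    prev = {start: None}
    queue = deque([start])
    while queue:
        r, c = queue.popleft()
        if (r, c) == goal:
            # Reconstruct the path back to the start.
            path, node = [], goal
            while node is not None:
                path.append(node)
                node = prev[node]
            return path[::-1]
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols \
                    and grid[nr][nc] == 0 and (nr, nc) not in prev:
                prev[(nr, nc)] = (r, c)
                queue.append((nr, nc))
    return None  # destination unreachable

grid = [[0, 0, 0],
        [1, 1, 0],
        [0, 0, 0]]
print(guide_route(grid, (0, 0), (2, 0)))
```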
- a control method of a cleaning robot may include: controlling mounting of the determined cleaning head on a robot arm; and controlling cleaning performed using the mounted cleaning head.
- controlling the mounting may further include fixing the mounted cleaning head through a docking unit and an electromagnet provided in the robot arm.
- the controlling of the mounting may include identifying, in a cleaning head storage box in which a plurality of cleaning heads are stored, the storage box holding the determined cleaning head, and controlling mounting between the robot arm and the cleaning head stored in the identified storage box.
- controlling the mounting may include identifying the position of the cleaning head storage box using an infrared sensor mounted on the cleaning head storage box.
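The mounting sequence described in these steps — find the storage box via the infrared sensor, pick the box holding the head for the current cleaning mode, dock, then fix the head with the electromagnet — can be sketched as follows. The mode-to-head mapping and the step strings are illustrative assumptions; hardware calls are stubbed.

```python
# Hypothetical sketch of the cleaning-head mounting sequence.
HEAD_FOR_MODE = {"dry": "brush_head", "wet": "mop_head", "steam": "steam_head"}

def mount_cleaning_head(mode, storage_boxes):
    """storage_boxes: mapping of box id -> stored head name.
    Returns (head, steps) on success, or None if no suitable head is stored."""
    head = HEAD_FOR_MODE.get(mode)
    if head is None:
        return None
    for box_id, stored in storage_boxes.items():
        if stored == head:
            steps = [f"locate box {box_id} via IR sensor",
                     f"dock robot arm to box {box_id}",
                     "energize electromagnet to fix head"]
            return head, steps
    return None  # required head not stored in any box

result = mount_cleaning_head("wet", {1: "brush_head", 2: "mop_head"})
print(result[0])  # mop_head
```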
- a control method of a cleaning robot may include: establishing a communication connection with at least one other cleaning robot; determining a method of collaborative cleaning with the at least one other cleaning robot based on at least one of the support specifications of the at least one other cleaning robot and the size and shape of the cleaning area; and controlling the cleaning robot and the at least one other cleaning robot based on the determined collaborative cleaning method.
- the determining may select any one of a group cleaning method in which the cleaning robot cleans the same area together with the at least one other cleaning robot, a sectional cleaning method in which the cleaning area is divided into sections that the cleaning robot and the at least one other cleaning robot clean separately, and a multi-pattern cleaning method in which the group cleaning method and the sectional cleaning method are mixed.
- the controlling may further include controlling the operation of at least one of the cleaning robot and the at least one other cleaning robot using at least one of environmental information and state information included in the cleaning situation information received from the at least one other cleaning robot.
- a method of controlling a cleaning robot may include: transmitting the support specification of the cleaning robot to at least one other cleaning robot and, in response, receiving a cleaning method and a cleaning area from the cleaning robot that received the transmitted support specification; and controlling a device in the cleaning robot based on the received cleaning method and cleaning area.
- the receiving may include, in response to receiving the cleaning method and the cleaning area, transmitting environment information of the cleaning area and state information of the cleaning robot.
- the controlling may perform cleaning by controlling the devices in the cleaning robot based on any one of a group cleaning method of cleaning the same area together with the at least one other cleaning robot, a sectional cleaning method of cleaning the cleaning area divided by section, and a multi-pattern cleaning method in which the group and sectional methods are mixed.
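The selection among the three collaborative methods could depend, as the claims state, on the support specifications and on the cleaning area's size and shape. The sketch below is one hypothetical decision rule; the thresholds and inputs (robot count, area in square meters, number of sections) are illustrative assumptions only.

```python
# Hypothetical sketch: choose a collaborative cleaning method from the number of
# participating robots and the cleaning area's size/shape.
def choose_method(num_robots, area_m2, num_sections):
    if num_robots < 2:
        return "solo"
    if num_sections >= num_robots:
        # Enough distinct sections: divide the area, one section per robot.
        return "sectional"
    if area_m2 <= 20:
        # Small single space: clean the same area together.
        return "group"
    return "multi-pattern"  # mix of group and sectional cleaning

print(choose_method(2, 15, 1))  # group
print(choose_method(3, 80, 4))  # sectional
```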
- a control method of a cleaning robot may include: detecting a user based on at least one of at least one IoT apparatus present in a room and a sensor module; and, if it is determined that the distance to the detected user is closer than a preset level, controlling the cleaning based on the detected state of the user.
- the controlling may further include switching to a silent mode or changing the cleaning area based on the detected user's behavior when it is determined that the distance to the detected user is closer than a preset level.
- the controlling may further include, when it is determined that the distance to the detected user is closer than a preset level, controlling the cleaning operation based on at least one of the user's behavior and location detected by the at least one IoT apparatus or the sensor unit, and the user's voice and ambient noise received through a voice recognition unit.
- the controlling may further include adjusting the level of the silent mode according to the amount of ambient noise received through the voice recognition unit.
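Adjusting the silent-mode level to the ambient noise received through the voice recognition unit could be as simple as a threshold table: the quieter the room, the lower the permitted noise output. The dB thresholds and level names below are assumptions for illustration.

```python
# Hypothetical sketch: map measured ambient noise to a silent-mode level
# (a cap on suction power / operating noise).
def silent_mode_level(ambient_db):
    if ambient_db < 35:   # very quiet room, e.g. the user may be sleeping
        return "low"
    if ambient_db < 55:   # normal conversation level
        return "medium"
    return "high"         # a noisy room masks the cleaner's own sound

print(silent_mode_level(30))  # low
print(silent_mode_level(60))  # high
```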
- a control method of a cleaning robot may include: setting desired information about the indoor environment; acquiring indoor environment information to generate an indoor environment map; and comparing the desired information about the indoor environment with the generated indoor environment map, and adjusting the indoor environment based on the comparison result.
- the generating may combine indoor environmental information obtained from at least one of a sensor module mounted on the cleaning robot, a sensor module mounted on a flying vehicle, and an IoT apparatus, and map the combined indoor environmental information onto map information to generate the indoor environment map.
- the adjusting may control at least one of the device in the cleaning robot and the IoT apparatus to adjust the indoor environment to correspond to the desired information about the indoor environment.
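The comparison-and-adjustment step described above can be sketched as a plan that compares desired values with measured values and names the devices to actuate. The environmental quantities, tolerance, and device names are illustrative assumptions, not the disclosed implementation.

```python
# Hypothetical sketch: derive adjustment actions from the gap between the desired
# indoor environment and the measured indoor environment map.
def plan_adjustments(desired, measured, tolerance=2.0):
    actions = []
    if measured["humidity"] < desired["humidity"] - tolerance:
        actions.append("run humidification module")
    elif measured["humidity"] > desired["humidity"] + tolerance:
        actions.append("run dehumidification module")
    if measured["dust"] > desired["dust"]:
        actions.append("run air cleaning module")
    return actions

desired = {"humidity": 50.0, "dust": 30}
measured = {"humidity": 41.0, "dust": 55}
print(plan_adjustments(desired, measured))
```

Each planned action would then be dispatched either to a module in the cleaning robot or to a connected IoT apparatus.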
- a method of controlling a cleaning robot may include: obtaining image information in response to a user's remote access request; delivering the obtained image information; and controlling a device in the cleaning robot.
- the controlling may further include, when another task is being performed, controlling the device in the cleaning robot to stop that task according to the remote access request and activating the image unit.
- the acquiring may include acquiring image information about the user's region of interest (ROI), or acquiring the image information according to the user's remote control.
- the acquiring may further include controlling the device in the cleaning robot to resume the stopped task when the user's remote connection is terminated.
- a control method of a cleaning robot according to one aspect includes: delivering a recorded voice; and controlling a device in the cleaning robot to transmit a preset voice or to connect to a preset contact.
- the identifying may identify the point where the recognized voice was generated using voice recognition units arranged in an array form.
- the controlling may further include controlling a device in the cleaning robot to move to the identified point, to transmit a preset voice, to record image information regarding the identified point, or to connect a telephone call through a home network.
- a control method of a cleaning robot may include: recognizing a user's call signal; receiving a voice command from the user and deriving a recognition result regarding the received voice command; and, if it is determined that the identified user is authorized to use the recognition result, controlling a device in the cleaning robot based on the recognition result.
- the controlling may further include determining the user's right of use through at least one of the user's motion and the user's voice.
- the controlling may further include controlling a device in the cleaning robot and at least one IoT apparatus connected through a communication network to provide a service corresponding to the recognition result.
- a control method of a cleaning robot may include: receiving the user's voice to detect abnormal signs; obtaining biometric information of the user; and determining the degree of the user's state based on the acquired biometric information, and controlling a device in the cleaning robot according to preset countermeasures based on the determined state of the user.
- the detecting may further include identifying the point at which the received user's voice was generated through a voice recognition unit.
- the controlling may include controlling a speaker according to the determined state of the user to transmit a preset voice to the user, or controlling a communication unit to connect to a preset contact.
- a control method of a cleaning robot may include: obtaining biometric information of a user at a preset period; and, if it is determined that there is an abnormality based on the degree of the user's state derived from the obtained biometric information, controlling the device in the cleaning robot according to predetermined countermeasures based on the determined state of the user and whether the user reacts.
- a method of controlling a cleaning robot may include: obtaining image information about an object by following an object selected by a user; and performing a process corresponding to a situation generated according to the movement of the object detected from the acquired image information.
- the acquiring may include receiving an object to be followed through an input unit or selecting an object to be followed in conjunction with an external device through a communication unit, and obtaining image information about the object by following the selected object.
- the detecting may detect contamination generated in accordance with the movement of the detected object, and when the contamination is detected, the cleaning may be performed by controlling the device in the cleaning robot.
- the performing may control the communication unit to connect to a preset contact or transmit a message.
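The per-situation processing described in these steps — follow the object, detect events from the acquired image information, and run the corresponding process (cleaning on contamination, warning and contacting on danger) — reduces to an event dispatch, sketched below. The event names and handler strings are hypothetical; detection itself is stubbed.

```python
# Hypothetical sketch: dispatch detected situations to their preset processes.
HANDLERS = {
    "contamination": "start cleaning at spot",
    "danger": "send warning via speaker and contact preset number",
}

def process_events(events):
    """events: list of situation names detected from acquired image information."""
    return [HANDLERS.get(e, "log only") for e in events]

print(process_events(["contamination", "danger", "idle"]))
```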
- Such a cleaning robot and its control method can provide the user with various convenience services in addition to the cleaning service corresponding to the basic function of the cleaning robot.
- FIG. 1 is a view showing a modular system structure of a cleaning robot according to an embodiment.
- FIG. 2 is a view showing the interior and exterior of the cleaning robot according to an embodiment.
- FIG. 3 is a view illustrating the exterior of a cleaning robot to which a robot arm is attached according to an embodiment.
- FIGS. 4 and 5 are control block diagrams of cleaning robots for performing cleaning by mounting a cleaning head on a robot arm, according to different embodiments.
- FIG. 6 is a view illustrating various types of cleaning heads mounted on the robot arm according to an exemplary embodiment.
- FIG. 7 is a view illustrating a cleaning head storage box in which the cleaning head is stored in each storage box according to an exemplary embodiment.
- FIG. 8 is a view for explaining that the cleaning head is mounted on the robot arm according to an embodiment.
- FIG. 9 is a cross-sectional view of a robot arm according to an embodiment.
- FIG. 10 is a flowchart illustrating an operation of a cleaning robot equipped with a cleaning head corresponding to a cleaning mode, according to an exemplary embodiment.
- FIG. 11 is a diagram illustrating a case where a cleaning storage box is provided in a charging station, according to an exemplary embodiment.
- FIGS. 12A and 12B illustrate a plurality of cleaning robots according to different embodiments.
- FIG. 13A is a control block diagram of a cleaning robot performing a collective cleaning according to an embodiment.
- FIG. 13B is a control block diagram of a plurality of cleaning robots performing a collective cleaning according to an embodiment.
- FIG. 14 is a diagram illustrating the structure of a collaborative cleaning area according to an exemplary embodiment.
- FIG. 15 is a flowchart illustrating an operation of a cleaning robot for performing a collective cleaning according to an embodiment.
- FIG. 16 is a diagram illustrating IoT devices directly connected through a home network or a communication network, according to another embodiment.
- FIG. 17 is a block diagram illustrating a cleaning robot for controlling a cleaning mode by detecting a user's position according to an exemplary embodiment.
- FIG. 18 is a diagram illustrating a room in which a cleaning robot and an IoT apparatus are installed according to an exemplary embodiment.
- FIG. 19 is a flowchart illustrating an operation of a cleaning robot for controlling a cleaning mode by detecting the location of a user according to an exemplary embodiment.
- FIG. 20 is a flowchart illustrating an operation of a cleaning robot that detects a user's position and controls the cleaning mode while performing cleaning according to an embodiment.
- FIG. 21 is a diagram for describing a case of adjusting the level of a silent mode by sensing ambient noise according to an exemplary embodiment.
- FIG. 22 is a diagram illustrating a case of changing a cleaning area by detecting a state of a user, according to an exemplary embodiment.
- FIG. 23 is a block diagram illustrating a cleaning robot for adjusting an indoor environment based on an environment map according to an embodiment.
- FIG. 24 is a diagram illustrating a flying vehicle to which a sensor module is attached according to an embodiment.
- FIG. 25 is a diagram for describing a case of collecting environmental information through a cleaning robot, a flying vehicle, and an IoT apparatus, according to an exemplary embodiment.
- FIG. 26 is a flowchart illustrating an operation of a cleaning robot that controls indoor environment by collecting indoor environment information according to an embodiment.
- FIG. 27 is a diagram for describing a case of adjusting an indoor environment by detecting the state of a user, according to an exemplary embodiment.
- FIG. 28 is a block diagram illustrating a cleaning robot for obtaining and providing indoor image information, according to an exemplary embodiment.
- FIG. 29 is a diagram for describing a case of acquiring image information according to a limited field of view according to an exemplary embodiment.
- FIG. 34 is a diagram for describing a case of acquiring image information through an image unit implemented in a rod form according to an embodiment.
- FIG. 35 is a diagram for describing a case of obtaining image information through a flying vehicle according to an embodiment.
- FIG. 36 is a flowchart illustrating an operation of a cleaning robot for obtaining image information through a remote connection according to an exemplary embodiment.
- FIG. 37 is a diagram for describing a case of acquiring image information of a desired region through an image unit implemented in a bar shape according to an embodiment.
- FIG. 38 illustrates a control block diagram of a cleaning robot that senses a sound and performs a process corresponding thereto according to an embodiment.
- FIG. 39 is a flowchart illustrating an operation of a cleaning robot that detects a sound and performs a process corresponding thereto according to an embodiment.
- FIG. 40 is a diagram illustrating an operation of a cleaning robot in response to a sound detected at a front door according to an exemplary embodiment.
- FIG. 41 is a block diagram illustrating a cleaning robot for performing a process corresponding to a voice recognition result, according to an exemplary embodiment.
- FIG. 42 is a diagram illustrating a radiation pattern transmitted from an IoT apparatus according to an embodiment.
- FIG. 43 is a flowchart illustrating an operation of a cleaning robot operating through security authentication according to an embodiment.
- FIG. 44 is a diagram for describing a case of controlling a cleaning robot and an IoT apparatus by receiving a user's voice command, according to an exemplary embodiment.
- FIG. 45 is a control block diagram of a cleaning robot that determines a user's state based on the user's biometric information, according to an exemplary embodiment.
- FIG. 46 illustrates a case in which a user's voice command is input through voice recognition units arranged in an array according to an embodiment.
- FIG. 47 is a flowchart illustrating an operation of a cleaning robot that operates according to a user's state determined based on the user's biometric information, according to an exemplary embodiment.
- FIG. 48 is a flowchart illustrating an operation of a cleaning robot that determines a user's state according to a set search time according to an embodiment.
- FIGS. 49 and 50 are diagrams for describing countermeasures through a home network when a user is in an emergency situation according to different embodiments.
- FIG. 51 is a block diagram illustrating a cleaning robot for providing a multimedia service according to an exemplary embodiment.
- FIG. 52 is a diagram for describing a case of obtaining image information by following a user through a flying vehicle according to an embodiment.
- FIG. 53 is a diagram for describing a case of detecting the possibility of danger to a user and performing a corresponding process according to an embodiment.
- FIG. 54 is a diagram for describing a case in which an image is displayed through a beam projector according to an embodiment.
- FIG. 55 is a diagram for describing a cleaning robot that provides a path to a specific area or room, according to an exemplary embodiment.
- FIG. 56 is a flowchart illustrating an operation of a cleaning robot that acquires image information by following an object according to an exemplary embodiment.
- FIG. 57 is a flowchart illustrating an operation of a cleaning robot that provides a safety service in anticipation of a risk according to a movement of an object, according to an exemplary embodiment.
- FIG. 1 is a diagram illustrating a modular system mounted with an IoT apparatus and a cleaning robot according to an exemplary embodiment.
- the cleaning robot 1 refers to a device that automatically cleans the cleaning space by suctioning foreign substances such as dust accumulated on the floor while traveling through the cleaning space without the user's manipulation. That is, the cleaning robot 1 travels through the cleaning space and cleans it.
- the cleaning robot 1 may provide not only a cleaning service but also various services.
- the cleaning robot 1 may not only provide a service for adjusting the indoor environment but also provide various services to a user existing in the indoor environment.
- the cleaning robot 1 may be a human service robot capable of various effects on the user and the indoor environment in which the user is located.
- the cleaning robot 1 can be extended to provide various services as well as cleaning services.
- a modular system is applied to the cleaning robot 1 so that various modules can be integrated.
- the modular system means a device or system in which at least one module supporting different functions is integrated, or a device or system supporting connections between modules supporting different functions.
- since the modular system is applied to the cleaning robot 1, a variety of modules can be embedded, making it possible to overcome the limitations of existing functions.
- the cleaning robot 1 may basically include a middle level module (a), a battery module (b), a BMS module (c), and a moving module (d).
- the middle level module (a) refers to a module for planning and managing the overall operation of the cleaning robot (1).
- the middle level module (a) serves as a medium for interaction between the user and the hardware performing the direct operation of the cleaning robot (1).
- the middle level module (a) may set or reserve operations of the cleaning robot 1 based on various control commands received through an input unit mounted on the cleaning robot 1 or from the IoT apparatus 2.
- the user may input various tasks to be performed by the cleaning robot 1 through various applications installed in the IoT apparatus 2. The middle level module (a) may then manage the operation of the cleaning robot 1 based on the input tasks. That is, the middle level module (a) serves to support the connection between the user and the hardware of the cleaning robot.
- the battery module (b) means a module to drive the cleaning robot.
- the battery module (b) mounted on the cleaning robot 1 may be implemented by various kinds of batteries known to those skilled in the art, and there is no limitation.
- the battery module (b) is built in the cleaning robot 1, and serves to supply power to the necessary components.
- the battery module (b) may also be referred to as a battery; hereinafter, it will be called a battery for convenience.
- BMS (Battery Management System) module (c) refers to the module that manages the overall battery of the cleaning robot (1).
- the BMS module (c) not only performs charging/discharging of the battery but also manages overall battery consumption. As will be described later, if the remaining power of the battery is less than or equal to a predetermined level, the BMS module (c) may request a return to the charging station.
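The charge-management check described here is a simple threshold decision; the sketch below illustrates it with an assumed 20% remaining-power threshold. The threshold value and return strings are hypothetical, not values from the disclosure.

```python
# Hypothetical sketch of the BMS module's return-to-charge decision: below a
# preset remaining-power level, request a return to the charging station.
RETURN_THRESHOLD = 0.20  # assumed 20% remaining power

def bms_check(remaining_ratio):
    if remaining_ratio <= RETURN_THRESHOLD:
        return "return to charging station"
    return "continue operation"

print(bms_check(0.15))  # return to charging station
print(bms_check(0.80))  # continue operation
```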
- the moving module (d) refers to a module for managing the overall movement of the cleaning robot (1).
- the moving module d may control the motor of the cleaning robot 1 to control an operation of the caster wheel or the like.
- the moving module (d) may control the movement path of the cleaning robot 1 or the like based on the detection result detected by the obstacle sensor to be described later.
- the cleaning robot 1 may be provided with a suction module (e).
- the suction module e may control an operation of sucking dust.
- the suction module e may be used to suck dust or the like during dry cleaning.
- the suction module e may control the suction operation of the dust by controlling the suction motor, the impeller, or the like.
- the cleaning robot 1 may be provided with a cyclone module f.
- the cyclone module f performs an operation of sucking only dust by centrifugal force among the air and dust sucked by the cleaning robot 1 and transferring the dust to the dust container.
- the dust accommodating module g isolates the dust sucked in the sealed space as the cleaning is performed in the dry cleaning mode.
- the dust accommodating module g refers to a module that controls the operation of accommodating the dust sucked from the cleaning head into the dust container during dry cleaning.
- the connection cleaning tool (h) means various detachable cleaning heads and the like used for cleaning.
- general cleaning tool (i) generally means a cleaning head or the like already mounted on the cleaning robot.
- the suction module (e), the cyclone module (f), the dust container module (g), the connection cleaning tool (h), and the general cleaning tool (i) described above may be used when performing the dry cleaning mode.
- when performing the dry cleaning mode, the cleaning robot may be equipped with the suction module (e), the cyclone module (f), and the dust accommodating module (g), in addition to the middle level module (a), the battery module (b), the BMS module (c), and the moving module (d) that are basically built into the cleaning robot.
- the cleaning robot is equipped with at least one of the connection cleaning tool h and the general cleaning tool i, so that dry cleaning can be performed.
- the water supply module (j) refers to a module that controls the supply of water to the cleaning head when wet cleaning or steam cleaning is performed.
- the wet cleaning tool k means various cleaning heads used for wet cleaning.
- the steam cleaning tool (l) means various cleaning heads used for steam cleaning.
- when wet cleaning or steam cleaning is performed, the wet cleaning tool (k) or the steam cleaning tool (l) may be built in.
- Air cleaning module (m) means a module for controlling the cleanliness of the indoor air, as will be described later.
- the humidification module n refers to a module that increases moisture in the air.
- Dehumidification module (o) means a module that cools the air to lower the humidity.
- the air cleaning module m, the humidification module n, and the dehumidification module o may provide a comfortable indoor environment to the user by adjusting the air cleanliness, humidity, and the like of the indoor environment.
- the flying vehicle module (p) refers to a module that controls the overall operation of a flying vehicle.
- the cleaning robot 1 may be equipped with a vehicle such as a drone to acquire indoor environment information in three dimensions.
- the sensor module q refers to a module for collecting indoor environment information.
- the sensor module q may be implemented as various sensors capable of obtaining environmental information and a processor capable of controlling the overall operation of the aforementioned sensors.
- the cleaning robot 1 may adjust the indoor environment by controlling not only the operation of the cleaning robot 1 itself but also the operation of the IoT apparatus 2, based on the environmental information obtained through the sensor module (q).
- the actuator module r refers to a module that mechanically converts power to operate various machines. As will be described later, the actuator module r may be mounted inside a support for supporting a joint or an image part of the robot arm to operate the robot arm or the support.
- the actuator module may also be referred to as an actuator; hereinafter, it will be called an actuator for convenience of description.
- the display module s means an apparatus for displaying various kinds of information to the user.
- the display module s includes all known devices capable of visually displaying various types of information such as a display panel and a beam projector, without limitation.
- the display module s may also be referred to as a display, hereinafter, referred to as a display for convenience of description.
- the recognition module t refers to a module capable of detecting a user's action.
- the action of the user includes not only the voice of the user but also various motions such as a gesture.
- the recognition module t may be implemented through a voice recognition module that detects a user's voice, a camera module that detects a user's motion, and the like.
- the recognition module t may be implemented through various devices capable of detecting a user's action, and the like.
- the gripper module u refers to a module for physically holding various objects.
- the cleaning robot 1 may be provided with a robot arm.
- the gripper module u may allow the cleaning head to be mounted to the robot arm via an electrical signal.
- the aforementioned modules can be integrated into the modular as needed.
- the modular system may be implemented through a micro control unit (MCU), a processor, or the like, and may be integrated into a control unit that controls the overall operation of the cleaning robot 1 or may be separately embedded in the cleaning robot 1.
- the above-described modules may be integrated into, or replaced by, other components described below. Therefore, even if the terms of the modules and the components described later do not coincide, a component performing the same function is regarded as including or corresponding to the module.
- At least one of the above-described modules may be integrated in a system on chip (SOC).
- however, since a plurality of system on chips may be embedded, the present invention is not limited to only one system on chip.
- FIG. 2 is a view showing the appearance of the cleaning robot according to different embodiments.
- the cleaning robot 1a includes a main body 10 forming the exterior, a cover 20 covering an upper portion of the main body 10, and components for cleaning dust present in the cleaning space.
- the main body 10 forms the exterior of the cleaning robot 1a and supports various components installed therein.
- the cleaning head mounting portion 30 is installed at the suction port 11 formed in the lower portion of the main body 10, and sweeps or scatters dust on the floor to improve dust suction efficiency.
- the cleaning head mounting portion 30 is installed at the suction port 11 with a length corresponding to the suction port 11, and includes a drum-shaped brush unit 31 that rotates in a roller manner against the floor surface to sweep or scatter dust on the floor, and a brush motor 32 that rotates the brush unit 31.
- the unit mounted to the cleaning head mounting unit 30 is not limited to the brush unit 31.
- various cleaning heads may be mounted in the cleaning head mounting unit 30 according to the cleaning mode.
- a caster wheel 60, whose rotation angle changes in accordance with the state of the floor surface on which the cleaning robot 1a moves, is provided.
- the caster wheel 60 supports the cleaning robot 1a, contributing to its stabilization and fall prevention, and is formed as a roller or caster-shaped wheel.
- the cleaning head may be mounted at various positions.
- a cleaning robot equipped with a cleaning head on the robot arm will be described.
- FIG. 3 is a view showing the exterior of a cleaning robot to which a robot arm is attached according to an embodiment, and FIGS. 4 and 5 are control block diagrams of cleaning robots that perform cleaning by mounting a cleaning head on the robot arm, according to different embodiments.
- FIG. 6 is a view illustrating various types of cleaning heads mounted on the robot arm according to an embodiment, and FIG. 7 is a view illustrating a cleaning head storage box in which the cleaning heads are stored in individual storage boxes according to an embodiment. These will be described together to avoid duplication.
- the robot arm 30 may be provided in the cleaning robot 1b.
- the cleaning robot 1b may be equipped with a cleaning head on the robot arm 30 to perform cleaning.
- the robot arm 30 is provided with joints so that it can move up/down and left/right.
- the robot arm 30 may be implemented with a variety of known materials, and there is no limitation.
- a coupling terminal may be provided on an outer surface of the robot arm 30. The cleaning robot 1b may mount the cleaning head through the coupling terminal and perform cleaning through the attached cleaning head.
- the cleaning robot 1b can perform cleaning more conveniently by selecting the necessary cleaning head according to the cleaning mode and mounting the selected cleaning head.
- hereinafter, the configuration of the cleaning robot 1b on which the robot arm 30 is mounted will be described in more detail.
- the cleaning robot 1b includes the robot arm 30, a water supply module 31, a dust storage module 33, a suction motor 35, a mounting module 36, a controller 37, a battery 50, a cleaning head storage box 40, an obstacle sensor 41, a display 42, and an input unit 43.
- the water supply module 31, the dust storage module 33, the obstacle sensor 41, the mounting module 36, and the controller 37 may be integrated in a system on chip (SOC) built into the cleaning robot 1b and operated by a processor. However, since more than one system on chip may be embedded in the cleaning robot 1b, the present invention is not limited to only one system on chip.
- the water supply module 31 refers to a module that controls the supply of water to the cleaning head when wet cleaning is performed.
- the water supply module 31 controls supply of water to the cleaning head through a water supply pipe embedded in the robot arm 30.
- the dust storage module 33 refers to a module for controlling an operation of storing dust sucked from the cleaning head into a dust container during dry cleaning.
- the dust storage module 33 transfers the dust sucked from the cleaning head to the dust container through the suction flow path provided in the robot arm 30. At this time, the dust storage module 33 sucks dust into the cleaning head using the suction motor 35.
- the battery 50 supplies power to the cleaning robot 1b.
- the battery 50 may be implemented in various forms such as external or internal, and there is no limitation.
- the obstacle sensor 41 means a sensor for detecting the obstacle.
- the obstacle sensor 41 may radiate an ultrasonic signal and detect an obstacle by receiving an ultrasonic signal reflected from the obstacle.
- the obstacle sensor 41 may detect an obstacle through various known methods, but is not limited thereto.
- the cleaning robot 1b may be provided with a display 42.
- the display 42 may be located in a central area on the upper surface of the cleaning robot 1b, but may be provided in various locations.
- the display 42 may be implemented as a liquid crystal display (LCD), a light emitting diode (LED), a plasma display panel (PDP), an organic light emitting diode (OLED), a cathode ray tube (CRT), or the like, but is not limited thereto.
- the display 42 may be implemented through a device capable of visually displaying various types of information in addition to the above-described components, and there is no limitation.
- the input unit 43 receives various control commands related to the operation of the cleaning robot 1b.
- the input unit 43 may be provided on the upper surface of the cleaning robot 1b.
- the display 42 may perform a function of the input unit 43.
- the robot arm 30 may be provided in the cleaning robot 1b.
- the robot arm 30 may be mounted on one surface of the cleaning robot 1b.
- the robot arm 30 may be mounted on the side of the cleaning robot 1b, but is not limited thereto and may be mounted anywhere on the cleaning robot 1b.
- a motor may be provided inside the robot arm 30.
- Each joint of the robot arm 30 moves through a motor.
- as shown in FIG. 3, the robot arm 30 is provided with two joints and can be freely moved up/down and left/right by the motor driving each joint.
- the controller 37 may control the movement of the robot arm 30 by controlling the driving of the motor.
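As a hedged sketch of the joint control described above, the controller (37) might command each joint motor and limit it to its mechanical range. The class name, angle units, and the ±90-degree limit are illustrative assumptions, not values from the patent:

```python
# Hypothetical sketch: a two-joint arm driven by one motor per joint
# (up/down, left/right), with angles clamped to an assumed range.

class RobotArm:
    """Two-joint arm; one motor per joint, as described for robot arm 30."""

    def __init__(self):
        self.joint_angles = [0.0, 0.0]  # degrees: [up/down, left/right]

    def move_joint(self, joint_index, angle):
        # Clamp to an assumed mechanical range of +/-90 degrees.
        clamped = max(-90.0, min(90.0, angle))
        self.joint_angles[joint_index] = clamped
        return clamped

arm = RobotArm()
arm.move_joint(0, 45.0)    # raise the arm
arm.move_joint(1, -120.0)  # request beyond range; clamped to -90
print(arm.joint_angles)    # [45.0, -90.0]
```

The clamping step stands in for whatever limit logic the real motor driver would enforce.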
- the robot arm 30 is provided with one or more passages, which can be connected to the cleaning head.
- the suction passage u1 and the water supply pipe u2 may be provided in the robot arm 30, as shown in FIG. 9.
- the water supply pipe u2 may be provided from the bucket containing the water to the end of the robot arm 30 on which the cleaning head is mounted.
- the suction passage u1 may be provided from the dust container to the end of the robot arm 30 on which the cleaning head is mounted.
- a docking unit u4 may be provided at an end of the robot arm 30 on which the cleaning head is mounted.
- the docking portion u4 is docked with a groove provided in the cleaning head, as shown in FIG. 9.
- an electrode u3 may be provided at the end of the docking portion u4. Accordingly, power and various control signals may be transmitted to the cleaning head through the docking unit u4.
- the cleaning head and the robot arm 30 are coupled through the docking unit u4, but the cleaning head may become separated as the cleaning robot 1b moves.
- the robot arm 30 may be provided with a coupling terminal u5 on the surface on which the cleaning head is mounted.
- the coupling terminal u5 serves to fix the cleaning head and the robot arm 30 to prevent the cleaning head from being separated.
- the coupling terminal u5 may be implemented through an electromagnet, but is not limited to this embodiment; it may be implemented through a variety of materials that can secure the cleaning head to the robot arm 30.
- a rubber packing is provided at the end of the coupling terminal u5 to prevent dust or water from being discharged to the outside.
- the cleaning robot 1b may be provided with a cleaning head storage box 40.
- the cleaning head storage box 40 may be mounted to the cleaning robot 1b or may be provided outside the cleaning robot 1b as shown in FIG. 8. Detailed description thereof will be described later.
- the cleaning head storage box 40 includes a dry cleaning head h1 used for dry cleaning, a water rag cleaning head h2 used for wet cleaning, a sterilization cleaning head h3 used for sterilizing cleaning, A corner cleaning head h4 used for corner cleaning, a bedding cleaning head h5 used for bedding cleaning, and a window frame cleaning head h6 used for cleaning window frames may be stored.
- the cleaning head storage box 40 is provided with a plurality of storage compartments, and the cleaning head stored in each compartment may be determined in advance. For example, referring to FIG. 6, the dry cleaning head h1 may be stored in the first storage box s1 of the cleaning head storage box 40, the water rag cleaning head h2 in the second storage box s2, the sterilization cleaning head h3 in the third storage box s3, the corner cleaning head h4 in the fourth storage box s4, the bedding cleaning head h5 in the fifth storage box s5, and the window frame cleaning head h6 in the sixth storage box s6.
- the cleaning head storage box 40 may store cleaning heads designated in each storage box according to a preset order.
- the control unit 37 may identify the storage location of each cleaning head based on the information stored in the memory, and control the cleaning head required for cleaning to be mounted on the robot arm.
- since each cleaning head differs in shape and size, the shape or size of each storage box may be designed differently.
- the cleaning head may be stored so that the surface coupled to the robot arm 30 faces the upper surface of the storage box. Accordingly, the control unit 37 can easily mount the cleaning head on the robot arm 30 without having to manipulate the cleaning head separately.
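The head-to-compartment assignment shown in FIG. 6 amounts to a lookup table kept in memory. A minimal sketch, assuming a simple dict layout (the key names are illustrative, not the patent's data format):

```python
# Hypothetical head-location table: which storage compartment (s1..s6)
# holds which cleaning head, following the assignment in FIG. 6.

HEAD_LOCATIONS = {
    "dry": "s1",            # dry cleaning head h1
    "wet_rag": "s2",        # water rag cleaning head h2
    "sterilization": "s3",  # sterilization cleaning head h3
    "corner": "s4",         # corner cleaning head h4
    "bedding": "s5",        # bedding cleaning head h5
    "window_frame": "s6",   # window frame cleaning head h6
}

def compartment_for(head_name):
    """Return the storage compartment that holds the requested head."""
    return HEAD_LOCATIONS[head_name]

print(compartment_for("sterilization"))  # s3
```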
- the controller 37 may control the overall operation of the cleaning robot 1b. Specifically, the control unit 37 may control the operation of all the components of the cleaning robot 1b, such as the robot arm 30, the display 42, as well as various modules built in the cleaning robot 1b.
- the controller 37 may be implemented through a processing device that performs various operations and control processes, such as a processor built into the cleaning robot 1b, and may be implemented through various known processing devices.
- the controller 37 may generate a control signal for controlling the components of the cleaning robot 1b to control the operations of the above-described components. For example, as described above, the controller 37 may transmit power and a control signal through an electrode provided at the end of the docking unit u4 to control the operation of the cleaning head.
- the control unit 37 includes a cleaning mode setting unit 38 and a mounting module 36.
- the cleaning mode setting unit 38 may set a cleaning mode.
- the cleaning mode means a mode related to a method in which the cleaning robot performs cleaning.
- the cleaning mode includes a dry cleaning mode, a wet cleaning mode, and a sterilization cleaning mode.
- according to the type of floor, the cleaning mode includes a hard floor cleaning mode for cleaning a general floor and a soft floor cleaning mode for cleaning floors such as carpet.
- according to the cleaning area, the cleaning mode includes a general cleaning mode for cleaning open areas and a corner cleaning mode for cleaning corners.
- depending on the cleaning mode, not only does the cleaning head used for cleaning differ, but the water supply, cleaning frequency, cleaning strength, and the like are set differently, as are whether the suction motor 35 operates and its rotation speed.
- the cleaning mode setting unit 38 may set the cleaning mode of the cleaning robot based on the information about the cleaning mode received from the user through the display 42, the input unit 43, a remote controller, or a user terminal.
- the cleaning mode setting unit 38 may set the cleaning mode corresponding to the received information and control the various device settings in the cleaning robot 1b required to operate in that cleaning mode.
- the setting information for each cleaning mode is stored in the memory. Accordingly, the cleaning mode setting unit may set the operation of the components of the cleaning robot 1b according to the cleaning mode received from the user based on the setting information for each cleaning mode stored in the memory.
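A minimal sketch of how such per-mode setting information might be organized in memory. The dict layout and the particular values are assumptions for illustration, not the patent's stored format:

```python
# Hypothetical per-mode settings table: each cleaning mode maps to the
# head it needs, whether the suction motor runs, and whether water is
# supplied. Values are illustrative assumptions.

MODE_SETTINGS = {
    "dry":           {"head": "dry",           "suction": True,  "water": False},
    "wet":           {"head": "wet_rag",       "suction": False, "water": True},
    "sterilization": {"head": "sterilization", "suction": False, "water": False},
}

def configure(mode):
    """Look up the component settings for the requested cleaning mode."""
    settings = MODE_SETTINGS[mode]
    return settings["head"], settings["suction"], settings["water"]

head, suction, water = configure("wet")
print(head, suction, water)  # wet_rag False True
```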
- the controller 37 may control the mounting of the cleaning head corresponding to the cleaning mode among the plurality of cleaning heads stored in the cleaning head storage box. For example, in the case of dry cleaning an open area, the controller 37 may use the mounting module 36 to mount the dry cleaning head h1 from the cleaning head storage box 40 on the robot arm 30.
- the mounting module 36 may correspond to the gripper module u in FIG. 1. That is, the mounting module 36 may perform an operation for physically mounting the cleaning head to the robot arm 30.
- the control unit 37 may control the necessary cleaning head to be mounted on the robot arm 30 according to each cleaning mode using the mounting module 36.
- the controller 37 may perform cleaning using a cleaning head mounted on the robot arm 30.
- the above-described cyclone module f or the like is integrated in the control unit 37 to perform dry cleaning.
- the controller 37 may control the operation of the water supply module 31 to perform wet cleaning using a cleaning head mounted on the robot arm 30.
- FIG. 10 is a flowchart illustrating an operation of a cleaning robot equipped with a cleaning head corresponding to a cleaning mode, according to an exemplary embodiment.
- the cleaning robot may set a cleaning mode (900).
- the cleaning robot may receive setting information regarding the cleaning mode from the user through an input device, a remote controller, or a user terminal provided in the display or the cleaning robot. Accordingly, the cleaning robot can set the cleaning mode based on the received setting information.
- the cleaning robot may determine a cleaning method, a cleaning head, and the like corresponding to the set cleaning mode in step 901. Accordingly, the cleaning robot may determine whether a cleaning head suitable for the cleaning mode is currently mounted on the robot arm (902).
- if a suitable cleaning head is mounted, the cleaning robot may perform cleaning using the mounted cleaning head (903). However, if the cleaning head suitable for the cleaning mode is not mounted, the cleaning robot may identify the suitable cleaning head among the cleaning heads stored in the cleaning head storage box and mount the identified cleaning head on the robot arm (904).
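The decision in steps 902-904 can be sketched as a small function: keep the mounted head if it already matches the mode, otherwise swap with the storage box. Function and variable names are assumptions for illustration:

```python
# Hypothetical sketch of steps 902-904: use the mounted head if it
# matches the mode, otherwise take the matching head from the storage
# box and return the old head to it.

def select_head(mode, mounted_head, storage_box):
    """Return the head to clean with, swapping from storage if needed."""
    required = {"dry": "dry", "wet": "wet_rag"}[mode]  # assumed mapping
    if mounted_head == required:
        return mounted_head                  # step 903: clean as-is
    storage_box.remove(required)             # step 904: take from box
    if mounted_head is not None:
        storage_box.append(mounted_head)     # put the old head back
    return required

box = ["wet_rag", "sterilization"]
print(select_head("wet", "dry", box))  # wet_rag
print(box)                             # ['sterilization', 'dry']
```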
- Cleaning head storage box means a case for storing the cleaning head mounted on the cleaning robot.
- the cleaning head storage box may be implemented in a divided form of a plurality of storage boxes. Each bin holds a cleaning head.
- the storage box in which the cleaning head is stored may be set in advance. That is, in the memory of the cleaning robot, information about which cleaning head is stored in each storage box is stored. Accordingly, the cleaning robot can identify the cleaning head corresponding to the cleaning mode set from the cleaning head storage box described above.
- the cleaning head storage box may be mounted on the cleaning robot.
- the cleaning head storage box may be designed to be mounted on an upper surface of the cleaning robot. Accordingly, the cleaning robot can grasp the position of the cleaning head stored in the cleaning head storage box through the information stored in the memory as described above, and perform cleaning while replacing the cleaning head according to the cleaning mode. Therefore, the cleaning robot according to the disclosed embodiment can continuously perform various cleaning methods.
- the cleaning head storage box may be mounted on another device.
- the cleaning head storage box can be designed to be mounted to a charging station.
- the cleaning head storage box 40 may be mounted on the main body of the charging station 51, but is not limited thereto.
- the location information of the cleaning head storage box mounted on the charging station is stored in the memory of the cleaning robot. Accordingly, the cleaning robot can move to the charging station to mount the necessary cleaning head from the cleaning head storage box mounted on the charging station.
- the cleaning head storage box may be provided at a preset position.
- the user can place the cleaning head storage box in the desired location.
- the user may input a position of the cleaning head storage box through the input unit of the cleaning robot.
- the cleaning robot can identify the position of the cleaning head storage box through the 3D sensor and go to the cleaning head storage box based on this.
- an infrared ray (IR) sensor may be provided in the cleaning head storage box. Accordingly, the cleaning robot can determine the position of the cleaning head storage box based on the guidance signal emitted through the infrared sensor.
- the cleaning robot according to the disclosed embodiment can perform cleaning more efficiently by easily replacing the cleaning head required by each cleaning mode using the robot arm.
- in order to perform cleaning, the cleaning robot requires various settings in addition to the cleaning mode, such as the area to be cleaned and the cleaning method for that area. Therefore, hereinafter, a cleaning robot that performs cleaning together with nearby cleaning robots according to a joint cleaning method based on the above-described settings will be described.
- FIGS. 12A and 12B illustrate a plurality of cleaning robots according to different embodiments.
- FIG. 13A illustrates a control block diagram of a cleaning robot performing joint cleaning according to an embodiment.
- FIG. 13B illustrates a control block diagram of a plurality of cleaning robots performing joint cleaning.
- FIG. 14 is a diagram illustrating the structure of a cleaning area according to an embodiment. These figures are described together below to avoid duplication.
- the cleaning robot 1c includes a display 42, an input unit 43, a communication unit 52, a SLAM module 53, a determination unit 54, and a control unit 55.
- the communication unit 52, the SLAM module 53, the determination unit 54, and the control unit 55 may be integrated in a system-on-chip embedded in the cleaning robot 1c, and may be integrated into a processor. Can be operated by.
- the present invention is not limited to only one system on chip.
- the SLAM (Simultaneous Localization And Mapping) module 53 refers to a module for generating map information about the indoor environment.
- the SLAM module 53 may be composed of a sensor that collects surrounding information while driving and an arithmetic unit, such as a microcontroller unit (MCU), that processes the collected information. That is, the SLAM module 53 collects surrounding information while driving through a sensor attached to the cleaning robot 1c, and combines the collected information to generate map information about the indoor environment.
- the communication unit 52 may exchange various data with an external device through a wireless communication network or a wired communication network.
- the wireless communication network means a communication network capable of transmitting and receiving a signal containing data wirelessly.
- the communication unit 52 may transmit and receive wireless signals between devices via a base station through a communication method such as 3G (3rd Generation) or 4G (4th Generation). In addition, it may transmit and receive wireless signals containing data with a terminal within a predetermined distance through a communication method such as wireless LAN, Wi-Fi, Bluetooth, Zigbee, WFD (Wi-Fi Direct), UWB (Ultra-wideband), IrDA (Infrared Data Association), BLE (Bluetooth Low Energy), or NFC (Near Field Communication).
- wired communication network means a communication network that can send and receive signals containing data by wire.
- wired communication networks include, but are not limited to, Peripheral Component Interconnect (PCI), PCI Express, Universal Serial Bus (USB), and the like.
- the communication unit 52 may connect with at least one other cleaning robot existing around the cleaning robot through a communication network.
- the communication unit 52 may perform connection with at least one other cleaning robot through direct communication or with at least one other cleaning robot through an external server.
- the external server may be outdoors or indoors.
- the external server may be a central server that manages a home network existing indoors, but is not limited thereto.
- any one of the IoT devices present in the room may serve as a gateway server of a home network or may be separately provided. Detailed description thereof will be described later.
- it is sufficient that the at least one other cleaning robot present around the cleaning robot has a built-in communication unit capable of communicating with an external device, regardless of its support specifications or form; it need not have the same support specifications or form.
- the cleaning robots 1c, 1d, and 1e performing the joint cleaning method may have the same support specifications and shapes, as shown in FIG. 12A, or different support specifications and shapes, as shown in FIG. 12B; it is sufficient if they are connected through a communication network. Accordingly, the shapes and support specifications of the cleaning robots described below are not limited to those shown in the drawings.
- the communication unit 52 may transmit / receive wired / wireless signals including various data from at least one other cleaning robot connected thereto.
- the communication unit 52 may transmit / receive identification information of each cleaning robot.
- the identification information is information for identifying the cleaning robot, and may include not only the unique number and product number of the cleaning robot but also an IP (Internet Protocol) address and a MAC address, but is not limited thereto.
- a universally unique identifier may be used as identification information.
- the universally unique identifier (UUID) may be encoded according to a variable-length character encoding scheme for Unicode, such as UTF-8, and used as identification information.
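As a hedged sketch of the identification scheme above, a robot could carry a UUID and exchange it as UTF-8 bytes; the framing around the bytes is an assumption, only the UUID/UTF-8 pairing comes from the text:

```python
# Hypothetical sketch: a UUID, UTF-8 encoded, used as identification
# information during connection setup.

import uuid

robot_id = uuid.uuid4()                  # universally unique identifier
encoded = str(robot_id).encode("utf-8")  # variable-length Unicode encoding

# A peer that receives the bytes can recover the identifier exactly.
decoded = uuid.UUID(encoded.decode("utf-8"))
print(decoded == robot_id)  # True
print(len(encoded))         # 36 bytes for the canonical hex-and-dash form
```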
- the communication unit 52 may identify the cleaning robot using the identification information, and connect with the cleaning robot identified through the Bluetooth pairing.
- the communication unit 52 may transmit / receive support specification information supported by each cleaning robot.
- the support specification information means information about the cleaning modes supported by the cleaning robot.
- in other words, the support specification information means information about the specifications of each cleaning robot.
- the cleaning robot 1c may identify the support specifications of the surrounding cleaning robots, determine a joint cleaning method, and transmit control commands based on those specifications, thereby performing joint cleaning with the surrounding cleaning robots. A detailed description thereof will be given later.
- the communication unit 52 may transmit / receive map information stored in each cleaning robot.
- the map information means a map of the area where the cleaning robot has performed the cleaning.
- the SLAM module 53 may generate map information by collecting area information on which cleaning is performed through a 3D sensor or a camera.
- the cleaning robot 1c according to the disclosed embodiment shares the map information stored in each cleaning robot through the communication unit 52 among the cleaning robots performing the joint cleaning, thereby performing the joint cleaning using the latest map, and may also update the map information. A detailed description thereof will be given later.
- the communication unit 52 may transmit / receive environmental information of each area with other cleaning robots during the common cleaning. Accordingly, the cleaning robot 1c according to the disclosed embodiment may perform more efficient cleaning by continuously correcting the cleaning method through environmental information.
- the communication unit 52 may operate as a master or a slave in connection with a communication unit built in another cleaning robot.
- the master refers to the role of leading the connection between the plurality of cleaning robots performing the joint cleaning and directing the determined joint cleaning method.
- the slave refers to the role of performing cleaning according to the joint cleaning method set by the master.
- the cleaning robot that leads the connection through the communication unit may be set as a master, and at least one other cleaning robot to be connected may be set as a slave.
- the method of setting the master and the slave is not limited to the above-described embodiment, and the master and the slave may be set between the cleaning robots through various known methods.
- a cleaning robot serving as a master will be referred to as a master cleaning robot
- a cleaning robot serving as a slave will be referred to as a slave cleaning robot.
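The master/slave convention above (the robot that leads the connection becomes master, the connected robots become slaves) can be sketched as follows; the role-assignment helper and robot labels are assumptions:

```python
# Hypothetical sketch: the robot that initiates the connection acts as
# master, and the robots that accept the connection act as slaves.

def assign_roles(initiator, responders):
    """The initiating robot becomes master; all others become slaves."""
    roles = {initiator: "master"}
    for robot in responders:
        roles[robot] = "slave"
    return roles

roles = assign_roles("1c", ["1d", "1e"])
print(roles)  # {'1c': 'master', '1d': 'slave', '1e': 'slave'}
```

The text notes that other election methods are possible; this is only the simplest rule it mentions.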
- the determination unit 54 may determine various settings necessary for the cleaning robot to perform the common cleaning. For example, the determination unit 54 may determine a cleaning area, a cleaning method, a cleaning mode, and the like in which the cleaning robot is to be cleaned.
- the determination unit 54 may determine the cleaning method and the cleaning mode based on the user's control command received through the display 42, the input unit 43, the user terminal, the remote control, and the like described above.
- the memory also stores programs that assist the user in determining various settings related to cleaning. Accordingly, the determination unit 54 may determine various settings necessary to perform the cleaning by reflecting the various settings related to the cleaning determined by the user.
- the determination unit 54 may determine a common cleaning method with at least one other cleaning robot.
- the common cleaning method means a method in which a plurality of cleaning robots are jointly cleaned.
- the joint cleaning method includes a group joint cleaning method in which the same area is cleaned by different methods, an area-specific joint cleaning method in which the cleaning area is divided and each robot cleans its own area, and a mixed joint cleaning method combining the aforementioned methods.
- Support specifications may be different among a plurality of cleaning robots.
- the first cleaning robot may support only the dry cleaning mode
- the second cleaning robot may support only the wet cleaning mode
- the cleaning robot 1c may support only the sterilization cleaning mode.
- the determination unit 54 may determine that the cleaning robot 1c, the first cleaning robot, and the second cleaning robot clean the same area together according to the group joint cleaning method for the cleaning area.
- for example, the determination unit 54 may determine the cleaning method so that the cleaning robot 1c performs sterilization cleaning in an area where the wet cleaning has been completed. That is, the determination unit 54 may determine the joint cleaning method in consideration of the support specifications of the cleaning robots connected through the communication unit 52.
- the support specification may be the same between the plurality of cleaning robots.
- the cleaning robot, the first cleaning robot, and the second cleaning robot may all support the dry, wet, and sterilization cleaning modes.
- the determination unit 54 may divide the cleaning area to be handled for each cleaning robot, and determine that each cleaning robot performs cleaning through dry, wet, and sterilization cleaning modes for each divided area.
- the determination unit 54 may divide the cleaning area through various methods. For example, the determination unit 54 may divide the size of the cleaning area in proportion to the cleaning speed of each cleaning robot. As another example, when the cleaning target area is divided into a plurality of rooms, the determination unit 54 may assign a room to each cleaning robot. At this time, the determination unit 54 may determine that a cleaning robot with a higher cleaning speed takes charge of a larger room. Alternatively, the determination unit 54 may determine the cleaning area of each cleaning robot by calculating the efficiency through various methods, such as determining the cleaning areas so that travel paths are short based on the position information of the cleaning robots.
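The first division rule above (area in proportion to cleaning speed) can be sketched directly; the units and the function name are illustrative assumptions:

```python
# Hypothetical sketch: split the total cleaning area among robots in
# proportion to each robot's cleaning speed (units are illustrative,
# e.g. m^2 and m^2/min).

def divide_area(total_area, speeds):
    """Return each robot's share of the area, proportional to its speed."""
    total_speed = sum(speeds.values())
    return {robot: total_area * s / total_speed for robot, s in speeds.items()}

shares = divide_area(60.0, {"1c": 2.0, "1d": 1.0, "1e": 1.0})
print(shares)  # {'1c': 30.0, '1d': 15.0, '1e': 15.0}
```

A faster robot thus takes a proportionally larger region, matching the "larger room to the faster robot" rule in the text.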
- the determination unit 54 may also determine that the cleaning robots jointly clean one part of the cleaning area and divide the remaining part among them. As another example, the determination unit 54 may determine that the plurality of cleaning robots jointly clean the entire cleaning area as described above, and there is no limitation.
- the room may be divided into three regions R1, R2, and R3.
- the cleaning robot 1c may perform cleaning by receiving a cleaning execution command for the three regions R1, R2, and R3 from the user through the above-described input unit, a remote controller, a user terminal, and the like.
- the cleaning robot 1c may perform cleaning on the entire indoor areas R1, R2, and R3 at a time preset by the user.
- the cleaning robot 1c may perform cleaning on three areas by itself, but may also perform a common cleaning between other cleaning robots present around the cleaning robot 1c.
- through joint cleaning, not only can cleaning efficiency be improved, but when cleaning that the cleaning robot does not support is required, another cleaning robot can perform it instead.
- the determination unit 54 may divide the area to be cleaned and decide on individual cleaning for each divided area according to the area-specific joint cleaning method, decide to jointly clean the entire area according to the group joint cleaning method, or decide to combine individual and joint cleaning according to the mixed joint cleaning method, and there is no limitation.
- Information about the method of determining the common cleaning method and the order of the cleaning modes performed in the common cleaning method may be preset and stored in the memory.
- the controller 55 may be implemented through a processing device that performs various operations and control processes, such as a processor built into the cleaning robot 1c, and may be implemented through various known processing devices.
- the controller 55 may control the overall operation of the cleaning robot 1c.
- the controller 55 may control the operation of all components of the cleaning robot 1c, such as the display 42, as well as the various modules built into the cleaning robot 1c.
- the controller 55 may generate control signals for controlling the components of the cleaning robot 1c to control the operations of the above-described components.
- the controller 55 controls the communication unit 52 to transmit the determination result regarding the common cleaning method to the at least one other cleaning robot through the control signal, so that the common cleaning is performed.
- the controller 55 corrects various settings related to the cleaning operation based on the environmental information received through the communication unit 52, thereby allowing the cleaning to be performed more efficiently.
- the environmental information is sensed through the sensor and means various information related to the environment of the cleaning area.
- the environmental information includes, but is not limited to, the amount of floor dust in the cleaning area, the floor humidity in the cleaning area, and the degree of sterilization of the cleaning area.
- for example, the controller 55 may transmit a control command to another cleaning robot to perform dry cleaning on a specific area once more, or may control the cleaning robot itself to perform dry cleaning on that area once more.
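As a hedged sketch of this correction loop, the controller might compare the sensed environmental information (e.g. floor-dust levels mentioned above) against a threshold and schedule another pass where needed. The threshold value and data shape are assumptions:

```python
# Hypothetical sketch: pick areas whose sensed dust level still exceeds
# a threshold, so they get one more dry-cleaning pass.

DUST_THRESHOLD = 0.3  # assumed normalized dust level

def areas_to_reclean(environment_info):
    """Return the areas whose dust level still exceeds the threshold."""
    return [area for area, dust in environment_info.items()
            if dust > DUST_THRESHOLD]

readings = {"R1": 0.1, "R2": 0.5, "R3": 0.2}
print(areas_to_reclean(readings))  # ['R2']
```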
- the controller 55 may integrate the suction module e, the cyclone module f, and the dust storage module g of FIG. 1, and thus may perform dry cleaning.
- the controller 55 may include a water supply module j in FIG. 1 to perform wet cleaning or steam cleaning. That is, the controller 55 may integrate various modules for performing cleaning, and there is no limitation.
- the same applies to the controllers built into the cleaning robots according to other embodiments described later, and there is no limitation.
- the cleaning robot 1c may monitor the entire cleaning area by using the environmental information received from the slave cleaning robot, so that the cleaning may be efficiently performed by reflecting the entire situation.
- the cleaning robot 1c may decide among the area-specific individual cleaning, group joint cleaning, and mixed joint cleaning methods so as to increase cleaning efficiency among the plurality of cleaning robots, and may control the cleaning robots to perform cleaning simultaneously.
- the controller 55 may receive situation information from another cleaning robot through the communication unit 52, and control the operation of the other cleaning robot based on the situation information.
- the situation information includes environmental information of the cleaning target area, cleaning state information, and the like, and the cleaning state information includes information on whether cleaning is in progress or completed, or whether an error has occurred.
- the cleaning robot 1c may monitor the entire situation of the cleaning area through the situation information, and may control at least one of the cleaning robot and another cleaning robot based on the situation.
- the controller 55 may store the connection date and connection method of another cleaning robot in the memory. Accordingly, the controller 55 may easily connect with the other cleaning robot by using the information stored in the memory. For example, when paired with another cleaning robot via a Bluetooth communication network, the controller 55 may store the connection information in the memory so that the robots can be connected automatically when pairing is performed later.
- controller 55 may control the communication unit 52 to update the map information stored in the memory or transfer the map information to another cleaning robot to share the latest map information.
- the cleaning robot 1c may be provided with a memory 59.
- the memory 59 may be provided with an algorithm or a program for controlling various settings required for the operation of the cleaning robot.
- the memory 59 may store the map information described above, as well as connection information for other cleaning robots.
- the memory 59 may be implemented through at least one type of storage medium among flash memory, hard disk, multimedia card micro, card-type memory (for example, SD or XD memory), RAM (Random Access Memory), SRAM (Static Random Access Memory), ROM (Read-Only Memory), EEPROM (Electrically Erasable Programmable Read-Only Memory), PROM (Programmable Read-Only Memory), magnetic memory, magnetic disk, and optical disc. However, the memory is not limited thereto and may be implemented in any other form known in the art.
- the cleaning robot may not only perform a role as a master but also as a slave.
- 13B shows a control block diagram of the master cleaning robot and the slave cleaning robot.
- each component is the same as the above-described components, but the slave cleaning robots 1d and 1e may receive the common cleaning method determined by the master cleaning robot through the communication unit 58 and perform cleaning based on it. In other words, the determination process regarding the common cleaning method is handled by the master cleaning robot.
- in the cleaning module 57a of the slave cleaning robot 1d, the suction module e, the cyclone module f, the dust storage module g, and the like of FIG. 1 are integrated.
- the sterilization module 57b of the slave cleaning robot 1e is integrated with the water supply module j in FIG. 1, and may perform at least one of wet cleaning and steam cleaning.
- the slave cleaning robots 1d and 1e transmit the environmental information acquired through their sensors to the master cleaning robot 1c, so that the master cleaning robot 1c can monitor the entire cleaning area even if the area to be cleaned is divided.
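- the aggregation of slave reports on the master side can be sketched as follows; the report format (area name mapped to sensor readings) is an illustrative assumption:

```python
def merge_reports(master_map, slave_reports):
    """Sketch of the master merging environmental information forwarded
    by the slaves, so the master can monitor the whole cleaning area
    even when the area is divided among robots."""
    for report in slave_reports:
        for area, readings in report.items():
            # Merge each slave's readings into the master's area map.
            master_map.setdefault(area, {}).update(readings)
    return master_map

combined = merge_reports(
    {"hall": {"dust": 0.2}},
    [{"kitchen": {"dust": 0.5}}, {"bedroom": {"dust": 0.1, "temp": 24}}],
)
print(sorted(combined))  # ['bedroom', 'hall', 'kitchen']
```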
- 15 is a flowchart illustrating an operation of a cleaning robot for performing a collective cleaning according to an embodiment.
- the cleaning robot may set a cleaning mode (900).
- the cleaning robot may receive setting information regarding the cleaning mode from the user through an input device, a remote controller, or a user terminal provided in the display or the cleaning robot. Accordingly, the cleaning robot can set the cleaning mode based on the received setting information.
- for example, when a cleaning method is input from the user, when another cleaning robot is detected in the vicinity, or when a connection history with another cleaning robot exists in the memory, the cleaning robot may set the cleaning mode according to the cleaning method.
- the cleaning robot may perform a connection with at least one other cleaning robot (1410). As described above, the cleaning robot may be connected to at least one other cleaning robot through direct communication or through a gateway server of a home network.
- the cleaning robot can determine a joint cleaning method together with the other connected cleaning robots.
- the cleaning robot may determine the common cleaning method by combining at least one of the size and shape of the cleaning area, and the support specifications of the cleaning robots as described above.
- for example, the cleaning robot may determine any one of a zone-based joint cleaning method, a group joint cleaning method, and a mixed joint cleaning method.
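- a selection among the three joint cleaning methods above, combining the cleaning area's size and shape with the robots' supported specifications, can be sketched as follows; the decision rules and the 50 m² threshold are illustrative assumptions, not values from the patent:

```python
def choose_joint_cleaning_method(area_size_m2, num_zones, robot_specs):
    """Pick one of the three joint cleaning methods described above.

    robot_specs: list of capability sets, e.g. [{"dry"}, {"wet"}].
    """
    same_capability = all(spec == robot_specs[0] for spec in robot_specs)
    if num_zones >= len(robot_specs) and same_capability:
        # Enough separate rooms for each robot to take its own zone.
        return "zone-based"
    if num_zones <= 1 or area_size_m2 < 50:  # threshold is arbitrary
        # One open (or small) area: the robots sweep it as a group.
        return "group"
    # Otherwise mix both strategies across the cleaning area.
    return "mixed"

print(choose_joint_cleaning_method(80, 3, [{"dry"}, {"dry"}]))           # zone-based
print(choose_joint_cleaning_method(40, 1, [{"dry"}, {"wet"}]))           # group
print(choose_joint_cleaning_method(60, 2, [{"dry"}, {"wet"}, {"dry"}]))  # mixed
```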
- the cleaning robot can perform the joint cleaning based on the determined joint cleaning method.
- when the cleaning robot operates as a master, it may transmit a control command to the slave cleaning robot so that the joint cleaning is performed.
- the cleaning robot receives the cleaning status information from the slave cleaning robot through the communication network and continuously monitors the cleaning status, so that it can perform additional cleaning if there is an area that needs more cleaning.
- the cleaning robot monitors the operation state of the slave cleaning robot, so that if an error occurs and the slave cleaning robot stops operating, another cleaning robot can cover its area.
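- the master-side error handling above can be sketched as follows; the data shapes and status strings are illustrative assumptions:

```python
def reassign_on_error(assignments, status_reports):
    """If a slave reports an error and stops operating, hand its
    remaining areas to a robot that is still cleaning.

    assignments:    dict robot_id -> list of area names
    status_reports: dict robot_id -> "cleaning" | "done" | "error"
    """
    working = [r for r, s in status_reports.items() if s == "cleaning"]
    for robot, status in status_reports.items():
        if status == "error" and working and robot in assignments:
            # Cover the failed robot's areas with an active robot.
            assignments[working[0]].extend(assignments.pop(robot))
    return assignments

plan = {"slave-1d": ["living room"], "slave-1e": ["kitchen"]}
status = {"slave-1d": "error", "slave-1e": "cleaning"}
print(reassign_on_error(plan, status))
# {'slave-1e': ['kitchen', 'living room']}
```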
- the cleaning robot may terminate the connection with the other cleaning robot.
- the cleaning robot may switch to the silent mode to reduce the noise generated while cleaning is performed, so as not to inconvenience the user.
- however, since switching to the silent mode lowers cleaning performance, switching unnecessarily may rather inconvenience the user.
- FIG. 16 is a diagram illustrating IoT devices connected through a home network or directly through a communication network according to different embodiments, FIG. 17 is a control block diagram of a cleaning robot that senses a user's position and controls a cleaning mode according to an embodiment, and FIG. 18 is a diagram illustrating a room in which a cleaning robot and an IoT apparatus are provided, according to an exemplary embodiment. These are described together below to avoid overlapping description.
- the cleaning robot 1g includes a suction motor 35, an input unit 43, a battery 50, a communication unit 52, a sensor module 61, a voice recognition unit 62, a speaker 63, a control unit 64, and a drive motor 100.
- the communication unit 52, the sensor module 61, and the control unit 64 may be integrated in a system on chip (SOC) built in the cleaning robot 1g. However, since more than one system on chip may be embedded in the cleaning robot 1g, they are not limited to being integrated into only one system on chip.
- the above-described components may be implemented through a processor such as an MCU.
- the voice recognition unit 62 may be implemented through a microphone.
- the voice recognition unit 62 may convert the received voice into an electrical signal.
- the speech recognition unit 62 may derive the speech waveform or convert the speech waveform into text. Accordingly, the voice recognition unit 62 may receive noise generated from various sound sources in addition to the user's voice.
- the sensor module 61 includes at least one sensor capable of sensing a user.
- the sensor module may be implemented with a sensor capable of sensing a user and a processor, such as an MCU, controlling the operation of the above-described sensors.
- the sensor refers to a device that detects a user, such as a stereo camera, an infrared sensor, a thermal sensor, a pyroelectric infrared (PIR) sensor, or a 3D sensor, and detects the distance between the user and the sensor or between the user and the cleaning robot to which the sensor is attached.
- the sensor module 61 may include at least one of the above-described sensors, not only to detect the location of the user, but also to detect the user's motion.
- the information detected by the sensor will be referred to as sensor information.
- since the cleaning robot 1g has a built-in communication unit 52, it can connect to the home network through the communication unit 52. Accordingly, the cleaning robot 1g may exchange various signals with various Internet of Things (IoT) devices existing in the room.
- here, an IoT (Internet of Things) apparatus refers to any of the various apparatuses existing in daily life that connect to a home network through a built-in communication unit and share data.
- the IoT apparatus includes not only home appliances such as telephones, microwave ovens, air conditioners, TVs, lights, door locks, but also all user terminals such as mobile phones, computers, and wearable devices.
- the home network refers to a network that provides a path for exchanging data with all IoT devices indoors and provides a path for access to an external internet network.
- the gateway server of the home network is a server that integrates and manages the home network, and any one of the IoT devices may perform the operation of the gateway server, or another server may exist separately to perform the operation of the gateway server.
- the cleaning robot may control various operations by transmitting various control commands to other IoT apparatuses located in the room through the gateway server of the home network.
- the cleaning robot may control the operation through direct communication with the IoT apparatus, and is not limited to controlling the operation through the gateway server of the home network.
- the communication unit 52 may receive the sensing information such as the location and operation of the user detected by the IoT apparatus from the gateway server or the IoT apparatus of the home network.
- the controller 64 may control the overall operation of the cleaning robot 1g. Specifically, the control unit 64 may control the operation of all components in the cleaning robot 1g, such as the communication unit 52, as well as the various modules built in the cleaning robot 1g. The controller 64 may generate control signals for the components of the cleaning robot 1g to control the operations of the above-described components.
- the controller 64 includes a detector 65 and a cleaning controller 66 as shown in FIG. 17.
- the cleaning controller 66 may control a cleaning method, a driving mode, the cleaning order for each cleaning area, and the like, and may control movement to a desired area by controlling the driving motor 100.
- the cleaning controller 66 may include the moving module d of FIG. 1.
- the cleaning controller 66 may control the driving of the cleaning robot 1g by controlling the driving motor 100, the caster wheels, and the like through control signals.
- the detector 65 may detect a user based on at least one of sensor information from at least one IoT apparatus present in the room and sensor information from the sensor module. For example, the detector 65 may receive sensor information acquired through a sensor of the IoT apparatus via the communication unit 52, or use the sensor information of the sensor built into the cleaning robot 1g, to detect where the user is located indoors and in what state the user is.
- the detector 65 may derive not only the distance between the user and the cleaning robot, but also what state the user is in, by using sensor information obtained through the various sensors existing in the room.
- the controller 64 may control the cleaning controller 66 using at least one of sensor information received through the communication unit 52 and sensor information detected through the sensor module. In other words, the controller 64 may control the cleaning operation based on at least the distance between the user and the cleaning robot sensed through the above-described components.
- control unit 64 controls the cleaning control unit 66 through a control signal to switch to the silent mode.
- the control unit 64 may control the cleaning performance based on at least one of the distance between the user and the cleaning robot detected by the above-described components, the user's behavior, the user's location, and a result received through the voice recognition unit 62.
- when the controller 64 determines that the user is currently sleeping or resting, it may control the cleaning robot to switch to the silent mode and, at the same time, to perform cleaning on areas other than the bedroom.
- a user may be sitting on a sofa and reading a newspaper.
- a user may be sensed by a sensor module embedded in an IoT apparatus such as an air conditioner or a sensor module embedded in a cleaning robot.
- in addition, the voice recognition unit 62 of the IoT apparatus or the voice recognition unit 62 of the cleaning robot 1g may receive TV sound. Accordingly, the detector 65 may control the cleaning operation by combining the results detected or input through the above-described components.
- for example, when the distance between the user and the cleaning robot is closer than a preset level according to the sensor information, and ambient noise is recognized through the voice recognition unit 62, the control unit 64 may switch to the silent mode through a control signal.
- the controller 64 may control the level of the silent mode in proportion to the magnitude of the ambient noise.
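- the proportional control above can be sketched as follows; the decibel range and the 0-10 level scale are illustrative assumptions:

```python
def silent_mode_level(ambient_noise_db, min_db=30.0, max_db=70.0):
    """Scale the silent-mode level in proportion to the ambient noise.

    A louder environment masks the robot's own noise, so a higher level
    (closer to normal output) is allowed; a quiet room forces the
    lowest, quietest level."""
    clamped = max(min_db, min(max_db, ambient_noise_db))
    fraction = (clamped - min_db) / (max_db - min_db)
    return round(10 * fraction)  # 0 = quietest operation, 10 = normal output

print(silent_mode_level(30))  # 0  -> very quiet room, strongest throttling
print(silent_mode_level(50))  # 5
print(silent_mode_level(70))  # 10 -> loud room, little throttling needed
```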
- the controller 64 may determine that the user is at rest. Accordingly, the controller 64 may control cleaning of only the areas in the cleaning area other than the room or area where the user is located. Alternatively, the control unit 64 may adjust the cleaning execution order so that the area around the user or the room where the user is located is cleaned last.
- 19 is a flowchart illustrating an operation of a cleaning robot for controlling a cleaning mode by detecting a location of a user according to an exemplary embodiment.
- the cleaning robot may detect at least one of a location and a state of the user by using at least one of the IoT apparatus and the sensor module (1800).
- the cleaning robot may detect a user's location and state through various sensors capable of detecting the user, such as a stereo camera, a 3D sensor, and a PIR sensor.
- the cleaning robot may determine that the object with movement among the objects detected around by the 3D sensor corresponds to the user.
- the cleaning robot may detect a user through image processing from image information acquired through a stereo camera.
- the cleaning robot may detect the user based on infrared rays emitted from the user's body through the PIR sensor.
- the cleaning robot may receive the location and status information of the user sensed by the IoT apparatus through the communication network.
- the aforementioned sensor may be built in the IoT apparatus.
- the cleaning robot may receive the location and state information detected by the IoT apparatus, and detect the location and state of the user based on it. Accordingly, the distance between the user and the cleaning robot can be derived by comparing the current position of the cleaning robot with the location of the user.
- the cleaning robot may control the cleaning operation based on the user's state (1810). For example, if it is determined that the distance between the cleaning robot and the user is closer than a preset level during the cleaning, the cleaning robot may switch to the silent mode, thereby preventing noise from occurring to the user. Accordingly, the cleaning robot according to the embodiment may increase the convenience of the user by performing the cleaning that does not affect the user's life.
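- the distance check in step 1810 can be sketched as follows; the 3-meter threshold and the mode names are illustrative assumptions:

```python
SILENT_DISTANCE_M = 3.0  # preset threshold; the actual value is configurable

def select_mode(user_position, robot_position):
    """Switch to the silent mode when the user is closer than a preset
    distance. Positions are (x, y) coordinates in meters."""
    dx = user_position[0] - robot_position[0]
    dy = user_position[1] - robot_position[1]
    distance = (dx * dx + dy * dy) ** 0.5
    return "silent" if distance < SILENT_DISTANCE_M else "normal"

print(select_mode((1.0, 1.0), (2.0, 2.0)))  # silent (about 1.41 m apart)
print(select_mode((0.0, 0.0), (5.0, 0.0)))  # normal (5 m apart)
```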
- however, the cleaning robot need not immediately switch to the silent mode and may determine the user's state in more detail to control the cleaning.
- the cleaning robot may receive noise, voice, etc. emitted from the surrounding sound source through the voice recognition unit.
- for example, when the cleaning robot decides to switch to the silent mode near the user, it may adjust the output of the suction motor and also the output of the driving motor so that the noise does not inconvenience the user.
- the cleaning robot may adjust the level of the silent mode according to the loudness around the user. That is, the cleaning robot can control the operation of its built-in devices so that cleaning is performed within a range that does not inconvenience the user. As another example, the cleaning robot may adjust the level of the silent mode according to the amount of noise radiated from surrounding sound sources.
- the cleaning robot may adjust the cleaning area itself based on the current position of the user. For example, if it is determined that the user is currently located at a desk or on a bed, the cleaning robot may change the order of the cleaning areas, for example by first cleaning areas other than the area or room where the user is located.
- FIG. 20 is a flowchart illustrating an operation of a cleaning robot that detects a user's position and controls a cleaning mode while performing cleaning according to an embodiment, FIG. 21 is a diagram illustrating a case of adjusting the level of the silent mode by sensing ambient noise according to an embodiment, and FIG. 22 is a view illustrating a case of changing a cleaning area by detecting a state of a user, according to an exemplary embodiment.
- the cleaning robot can begin performing cleaning (1910). For example, when the cleaning robot receives an execution command from a user through an input unit or receives an execution command from an IoT apparatus through a communication unit, the cleaning robot may start cleaning. In addition, the cleaning robot can start cleaning at a time preset by the user, and there is no limitation.
- the cleaning robot may perform cleaning (1910).
- the cleaning robot can perform cleaning according to various cleaning modes, and can set the order of the cleaning areas within the entire cleaning area. For example, when the cleaning area is divided into a plurality of rooms, the cleaning robot may set the cleaning order for each room according to various methods, and then perform cleaning based on that order. As another example, when the cleaning area corresponds to one room, the cleaning robot may set in which direction to perform cleaning according to various methods, and then perform cleaning accordingly.
- the cleaning robot can complete the cleaning. Accordingly, the cleaning robot may move to the charging station or move to a predetermined area according to the remaining amount of battery.
- the cleaning robot may detect a user when the cleaning robot performs cleaning on the cleaning target area (1930).
- the cleaning robot may detect a user in conjunction with an IoT apparatus or through a built-in sensor module. Detailed description thereof will be omitted as it is the same as described above.
- the cleaning robot may control the cleaning operation according to whether the user is detected. If the user is not detected, the cleaning robot may perform cleaning in the set cleaning mode on the cleaning target area without additionally controlling the operation.
- if the user is detected, the cleaning robot may control the cleaning method, the cleaning mode, and the cleaning target area so that the user is less affected (1940).
- the cleaning robot may control a cleaning operation based on at least one of the distance between the user and the cleaning robot, the current location of the user, the sound input through the microphone, and the user's movement.
- the cleaning robot may be switched to the silent mode, so that the user may not be affected by the noise.
- however, the cleaning performance of the cleaning robot is lowered when it switches to the silent mode. Accordingly, even if the distance between the user and the cleaning robot is closer than the preset distance, the cleaning robot may adjust the level of the silent mode or adjust the cleaning order by determining the user's movement or the ambient noise.
- the preset distance may be set by the user or may be set in advance when designing the cleaning robot.
- the cleaning robot may determine that the user is resting or going to bed and control to clean another area first.
- alternatively, the cleaning robot may control other areas to be cleaned first; the method is not limited thereto.
- the cleaning robot may set the level of the silent mode according to the amount of ambient noise input. For example, when the user is washing dishes, the cleaning robot may set the level of the silent mode according to the amount of noise generated by the washing dishes.
- as another example, the cleaning robot may set the level of the silent mode according to the volume of the TV sound. Accordingly, the cleaning robot can increase its cleaning performance while not disturbing the user's TV viewing.
- the cleaning robot may determine that the user is currently located in the bed through a sensor embedded in the bed. Accordingly, the cleaning robot may exclude the room in which the user is currently located from the cleaning target area, or allow the cleaning to be performed last among the cleaning areas.
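- the reordering and exclusion described above can be sketched as follows; the area names and the flag are illustrative assumptions:

```python
def order_cleaning_areas(areas, user_area, exclude=False):
    """Reorder (or exclude) the room where the user is located.

    exclude=True drops the user's room from the cleaning target area
    entirely; otherwise it is simply moved to the end of the order."""
    others = [a for a in areas if a != user_area]
    if exclude or user_area not in areas:
        return others
    return others + [user_area]  # clean the user's room last

rooms = ["living room", "bedroom", "kitchen"]
print(order_cleaning_areas(rooms, "bedroom"))
# ['living room', 'kitchen', 'bedroom']
print(order_cleaning_areas(rooms, "bedroom", exclude=True))
# ['living room', 'kitchen']
```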
- the cleaning robot according to the disclosed embodiment provides control of the indoor environment as well as switching to the silent mode according to the position and state of the user, for the convenience of the user.
- a cleaning robot for controlling the indoor environment based on the indoor environment map will be described.
- FIG. 23 is a view showing a control block diagram of a cleaning robot for adjusting an indoor environment based on an environment map according to an embodiment
- FIG. 24 is a view showing a vehicle equipped with a sensor module according to one embodiment
- FIG. 25 is a diagram for describing a case of collecting environmental information through a cleaning robot, a flying vehicle, and an IoT apparatus, according to an exemplary embodiment.
- the cleaning robot 1h includes a display 42, an input unit 43, a communication unit 52, a memory 59, a voice recognition unit 62, a speaker 63, a sensor module 67, a vehicle 68, a vehicle charging station 69, and a controller 70.
- the communication unit 52, the memory 59, the sensor module 67, and the control unit 70 may be integrated in a system on chip embedded in the cleaning robot, but as described above, they are not limited to being integrated in one system on chip.
- since the display 42, the input unit 43, the communication unit 52, the memory 59, the voice recognition unit 62, and the speaker 63 are the same as described above, description thereof will be omitted.
- the vehicle 68 refers to an object that can receive a control command from the cleaning robot 1h and fly to a target area.
- the aircraft 68 includes a drone equipped with a sensor module.
- the aircraft 68 may be equipped with the sensor module 67 at its lower portion to obtain environmental information of the area in which it is located.
- the sensor module 67 may be implemented as a substrate in which various sensors capable of obtaining environmental information and a processor such as an MCU are integrated. Detailed description of the sensor module will be described later.
- the sensor module 67 may be mounted to the lower end of the vehicle 68 by a lift. Accordingly, the vehicle 68 may control the position of the sensor module 74 by adjusting the height of the lift as well as adjusting the flying height of the vehicle 68.
- the cleaning robot 1h may be provided with a vehicle charging station 69.
- the vehicle charging station 69 may be provided on the upper surface of the main body of the cleaning robot (1h).
- while the aircraft 68 flies throughout the house to obtain environmental information, its battery may run low.
- accordingly, the cleaning robot 1h is provided with a vehicle charging station 69 that charges the battery built into the vehicle.
- the cleaning robot 1h can move with the aircraft 68 mounted.
- a fixing body may be provided so that the flying body 68 does not fall off from the cleaning robot 1h while the cleaning robot 1h moves. That is, the vehicle 68 and the cleaning robot 1h may be fixed to each other through the fixing body. More specifically, the control unit 70 may control the operation of the fixing body through a control signal to fix the vehicle to the main body of the cleaning robot 1h.
- since the control unit 70 has the aircraft module p of FIG. 1 built in, it can control the overall operation of the aircraft.
- the controller 70 may control the overall operation of the cleaning robot 1h. Specifically, the control unit 70 may control the operation of all components of the cleaning robot 1h, such as the display 42, the aircraft 68, and the fixing body, as well as the various modules built in the cleaning robot 1h. For example, the controller 70 may generate control signals for the components of the cleaning robot 1h to control the operations of the above-described components.
- the controller 70 may be implemented through a processing device that performs various calculations and control processes, such as a processor built in the cleaning robot 1h, and may be implemented through various known processing devices.
- the control unit 70 includes a desired information setting unit 71, a generation unit 72, and an indoor environment control unit 73.
- the desired information setting unit 71 may set desired information about the indoor environment.
- the desired information about the indoor environment refers to information about the indoor environment desired by the user, that is, the desired levels of various environmental parameters such as indoor temperature, air cleanliness, and illuminance.
- the desired information setting unit 71 may receive information related to the indoor environment from the user through the input unit 43, the communication unit 52, and the like, and set the desired information. Meanwhile, there is no restriction on how the information related to the indoor environment is received from the user.
- the information about the indoor environment means environment setting information desired by the user.
- for example, the input unit 43 may receive a specific desired temperature such as 25 degrees from the user, or may receive a parameter such as 'very cool, cool, comfortable, warm, very warm'. Then, the desired information setting unit 71 may set a value corresponding to each parameter. That is, since the user does not need to enter each specific environment parameter value, the user's convenience can be increased.
- the desired information setting unit 71 may set the temperature corresponding to 'very cool' to 19 degrees, and may set the temperature corresponding to 'cool' to 23 degrees. At this time, the environment corresponding to each parameter may be set by the user or may be set in advance when designing the cleaning robot 1h.
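- the parameter-to-value mapping above can be sketched as follows; the 19- and 23-degree values come from the example, while the remaining defaults are assumptions that the user could override:

```python
# Illustrative mapping from comfort parameters to target temperatures.
COMFORT_TO_CELSIUS = {
    "very cool": 19,
    "cool": 23,
    "comfortable": 25,  # assumed default
    "warm": 27,         # assumed default
    "very warm": 29,    # assumed default
}

def desired_temperature(setting):
    """Resolve a comfort parameter (or an explicit number) to degrees."""
    if isinstance(setting, (int, float)):
        return setting  # the user entered a concrete value such as 25
    return COMFORT_TO_CELSIUS[setting]

print(desired_temperature("very cool"))  # 19
print(desired_temperature(25))           # 25
```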
- the generation unit 72 may generate an indoor environment map by obtaining indoor environment information.
- the indoor map information obtained through the SLAM module or the communication network is stored in the memory 59 of the cleaning robot.
- the generation unit 72 may generate the indoor environment map by mapping the indoor environment information obtained through various kinds of methods to the map information.
- the indoor environment information may be obtained using at least one of at least one IoT apparatus, a sensor module 67 embedded in the cleaning robot, and a sensor module mounted on the aircraft.
- the IoT apparatus may include various sensors capable of acquiring indoor environment information. Sensors include temperature sensors, humidity sensors, gas sensors, dust sensors, noise sensors, illuminance sensors, odor sensors, and radon level sensors.
- the temperature sensor refers to a sensor that detects a temperature of an object or ambient air.
- the temperature sensor can sense the temperature and convert it into an electrical signal.
- depending on the measuring method, temperature sensors include a contact type, which senses temperature based on a resistance value or a voltage generated by a temperature difference, and a non-contact type, which detects infrared energy radiated from the heat source; there is no limitation on the temperature sensor.
- the humidity sensor refers to a sensor that detects humidity by using physical and chemical phenomena of moisture in the air.
- humidity sensors include wet-and-dry-bulb hygrometers, hair hygrometers, lithium chloride humidity sensors, and the like, according to the method of detecting humidity, and there is no limitation on the humidity sensor integrated in the sensor module.
- the gas sensor refers to a sensor that detects a specific chemical contained in a gas and detects a gas concentration based thereon. Specifically, the gas sensor refers to a sensor for detecting a specific chemical contained in the gas to measure the concentration, and converts the signal into an electrical signal in proportion to the concentration.
- according to the detection method, gas sensors include a method using a change in the physical properties of a solid due to adsorption or reaction of a gas, a method using combustion heat, a method using an electrochemical reaction, and a method using physical characteristic values; there is no limitation on the gas sensor.
- the dust sensor refers to a sensor for detecting the amount of floating dust.
- the dust sensor measures the amount of dust based on the pattern of light scattered differently according to the size of the dust particles; there are side-scattered-light and near-infrared-scattered-light types, and there is no limitation on the dust sensor integrated in the sensor module.
- the noise sensor refers to a sensor that detects a magnitude of noise by measuring a change in capacitance according to sound pressure. Electrostatic and piezoelectric types exist in the noise sensor according to a detection method, and there is no limitation in the noise sensor integrated in the sensor module.
- the illuminance sensor refers to a sensor that senses the brightness of the light and converts it into an electrical signal, and measures the illuminance based on this.
- the illuminance sensor may change the output current according to the intensity of light, and measure the illuminance based on this.
- the odor sensor refers to a sensor that detects odor and quantifies the intensity of the odor and the degree of odor.
- the odor sensor may generate a chemical reaction with substances in the odor and output an electrical signal corresponding to the value.
- a radon level sensor means a sensor that measures radon, an inert gaseous element which emits radiation.
- all sensors capable of acquiring indoor environmental information are included, without limitation.
- the sensor module 67 may be configured with sensors capable of acquiring indoor environment information as described above and a processor, such as an MCU, for processing the output values of the sensors.
- the sensor module 67 may be mounted on the cleaning robot 1h, may be mounted on the flying body 68, or may be mounted on both the cleaning robot 1h and the flying body 68.
- the generation unit 72 may receive environmental information from an IoT apparatus such as an air conditioner, obtain environmental information using the sensor module 67 mounted on the cleaning robot, or receive environmental information from the sensor module installed in the aircraft 68, and generate the indoor environment map from it. At this time, the generation unit 72 may combine the indoor environment information obtained from the aforementioned devices in order to generate a more accurate indoor environment map.
- for example, the air conditioner may measure the room temperature as 26 degrees while the sensor module 67 measures 25 degrees.
- in this case, the generation unit 72 may select either of the two pieces of information if the error between the indoor environment information obtained from the two devices is within a predetermined level. However, if the error of the indoor environment information exceeds the preset level, the generation unit 72 may request re-measurement of the indoor environment information.
- the preset level may be set for each environmental information, and may be determined by the user or may be determined when designing the cleaning robot 1h.
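- the cross-check described above can be sketched as follows; the per-parameter tolerance values, and the choice of returning the IoT reading when both agree, are illustrative assumptions:

```python
# Per-parameter error tolerances (preset levels); the numbers are assumed.
TOLERANCE = {"temperature": 2.0, "humidity": 5.0}

def reconcile(parameter, iot_value, robot_value):
    """Cross-check readings from the IoT apparatus and the robot's
    sensor module: within tolerance, either value may be selected
    (here the IoT value); beyond it, a re-measurement is requested."""
    if abs(iot_value - robot_value) <= TOLERANCE[parameter]:
        return ("ok", iot_value)
    return ("remeasure", None)

print(reconcile("temperature", 26.0, 25.0))  # ('ok', 26.0)
print(reconcile("temperature", 26.0, 21.0))  # ('remeasure', None)
```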
- indoor environment information may be obtained from an IoT apparatus, a sensor module 67 embedded in a cleaning robot, and a sensor module 74 mounted on a vehicle 68.
- the generation unit 72 may control, through the control unit 70, the flight of the aircraft 68 so as to obtain indoor environment information from an intermediate position between the IoT apparatus and the cleaning robot 1h.
- the generation unit 72 may select the indoor environment information determined to have the highest accuracy among the three indoor environment information obtained from each device, and map it to the map information.
- for example, the relationship between the output value of the IoT device, the output value of the sensor module 74 mounted on the vehicle 68, and the output value of the sensor module 67 built in the cleaning robot 1h may be 'output value of the IoT device < output value of the sensor module 74 mounted on the vehicle 68 < output value of the sensor module 67 built in the cleaning robot 1h'.
- alternatively, the relationship between the output values may be expressed as 'output value of the sensor module 67 embedded in the cleaning robot 1h < output value of the sensor module 74 mounted on the aircraft 68 < output value of the IoT apparatus'.
- in either case, the generation unit 72 may generate the indoor environment map by selecting the output value of the sensor module 74 mounted on the aircraft 68, which corresponds to the intermediate value, and mapping it to the map information.
- as another example, the relationship between the output values may be expressed as 'output value of the sensor module 74 mounted on the aircraft 68 < output value of the sensor module 67 built in the cleaning robot 1h < output value of the IoT apparatus'.
- in this case, the generation unit 72 may select the output value of the sensor module 67 embedded in the cleaning robot 1h, which is closest to the output value of the sensor module 74 mounted on the aircraft 68, and generate the indoor environment map based on it. Meanwhile, the generation unit 72 may perform the above-described process for each parameter constituting the environmental information, such as temperature information and odor information, and select the value determined to have the highest accuracy.
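The per-parameter selection described above amounts to taking the intermediate (median) of the three device readings for each environmental parameter. A hypothetical sketch, with invented device keys and sample values:

```python
def select_per_parameter(readings):
    """readings: {parameter: {device: value}} -> {parameter: (device, value)}."""
    selected = {}
    for parameter, by_device in readings.items():
        # sort the three (device, value) pairs by value and keep the middle one,
        # which the text treats as the reading with the highest accuracy
        ordered = sorted(by_device.items(), key=lambda item: item[1])
        selected[parameter] = ordered[len(ordered) // 2]
    return selected

readings = {
    "temperature": {"iot": 24.0, "aircraft_74": 25.0, "robot_67": 26.0},
    "dust": {"aircraft_74": 30.0, "robot_67": 35.0, "iot": 40.0},
}
chosen = select_per_parameter(readings)
```

For the sample readings, the aircraft's sensor module 74 is selected for temperature and the robot's sensor module 67 for dust, mirroring how a different device can be chosen per parameter.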
- the indoor environment controller 73 may adjust the indoor environment by comparing the indoor environment map obtained through the generation unit 72 with the desired information about the indoor environment received from the user. At this time, the indoor environment controller 73 may adjust the indoor environment not only through the cleaning robot but also in cooperation with the IoT apparatus.
- for example, the indoor environment controller 73 may transmit a control command to the air conditioner through a communication network to control the operation of the air conditioner, so that the room temperature becomes the temperature desired by the user.
- the indoor environment controller 73 may transmit a control command to an indoor lamp to adjust the illuminance so that the indoor brightness becomes the brightness desired by the user.
- the indoor environment controller 73 may also control devices in the cleaning robot 1h so that the indoor dust amount or humidity becomes the environment desired by the user, without limitation.
- for example, the indoor environment controller 73 may use the humidification module (n) to adjust the indoor environment so that the humidity becomes the level desired by the user.
- the indoor environment controller 73 may lower the humidity by using the dehumidification module o.
- the indoor environment control unit 73 may control the operation of the dehumidifier located in the vicinity through the communication network, thereby lowering the humidity.
- when the modular design is applied to the cleaning robot 1h and various modules are integrated, the humidity may also be lowered through the dehumidification module (o) mounted on the cleaning robot 1h, without limitation.
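The dispatch logic of the indoor environment controller 73 sketched in the paragraphs above can be illustrated as follows. This is an assumption-laden sketch, not the patent's implementation: the parameter-to-action table, the `deadband` threshold, and all names are invented; only the idea of comparing the current map against the user's desired values and acting per parameter comes from the text.

```python
# Illustrative mapping from environmental parameter to corrective action
ACTIONS = {
    "temperature": "send control command to air conditioner",
    "illuminance": "send control command to indoor lamp",
    "humidity": "run humidification module (n) / dehumidification module (o)",
    "dust": "run cleaning mode",
}

def plan_adjustments(current, desired, deadband=0.5):
    """Compare the indoor environment map against the desired information and
    return the corrective actions needed; within the deadband, do nothing."""
    plan = []
    for parameter, target in desired.items():
        if abs(current.get(parameter, target) - target) > deadband:
            plan.append(ACTIONS[parameter])
    return plan

plan = plan_adjustments(
    current={"temperature": 28.0, "humidity": 60.0},
    desired={"temperature": 25.0, "humidity": 60.0},
)
```

Here the temperature is off target so the air conditioner is commanded, while the humidity already matches and no action is planned, matching the standby behavior described later for an environment that already corresponds to the desired information.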
- FIG. 26 is a flowchart illustrating an operation of a cleaning robot that collects indoor environment information and adjusts an indoor environment according to an embodiment
- FIG. 27 is a diagram for explaining a case where the indoor environment is sensed and adjusted to the state desired by the user, according to an embodiment.
- the cleaning robot may collect indoor environmental information using a sensor capable of detecting the indoor environment (2500).
- the type of sensor includes various sensors capable of sensing an indoor environment as described above, but there is no limitation.
- the cleaning robot may collect environmental information by using or combining at least one of a sensor mounted on a main body, a sensor mounted on a vehicle, and a sensor mounted on an IoT apparatus.
- the cleaning robot may generate an indoor environment map by mapping the collected environment information to the map information stored in the memory (2510). Accordingly, the cleaning robot can specifically grasp the current indoor environment state.
- the cleaning robot may receive the desired information about the indoor environment from the user.
- the cleaning robot may receive desired information about an indoor environment from an IoT apparatus such as an input unit, a remote controller, or a user terminal.
- when the current indoor environment already corresponds to the desired information, the cleaning robot may switch to the standby state without performing the correction process of the indoor environment (2530).
- otherwise, the cleaning robot may perform a correction process so that the indoor environment is adjusted to correspond to the desired information about the indoor environment.
- the cleaning robot may not only control the devices in the cleaning robot, but also control the operation of the IoT apparatus through the home network as shown in FIG. 27 to perform a correction process.
- the cleaning robot may perform a process of reducing dust or disinfecting the room through the cleaning mode.
- the cleaning robot may perform a process such as lowering or raising an indoor temperature by interworking with an IoT apparatus through a home network as shown in FIG. 27.
- the cleaning robot may also perform a process of reducing or increasing the illuminance of the room in conjunction with the indoor lighting through the home network; the method of controlling the indoor environment by operating the cleaning robot and IoT apparatuses is not limited thereto.
- FIG. 28 is a block diagram illustrating a cleaning robot for acquiring and providing indoor image information, according to an exemplary embodiment.
- FIG. 29 is a diagram for describing a case of acquiring image information according to a limited field of view according to an exemplary embodiment.
- FIGS. 30 to 33 are views illustrating an image unit implemented according to different embodiments.
- FIG. 34 is a diagram illustrating a case of acquiring image information through an image unit implemented in a bar shape, and
- FIG. 35 is a diagram for explaining a case of acquiring image information through the aircraft 68 according to an embodiment.
- the cleaning robot 1i may include an input unit 43, a communication unit 52, a vehicle 68, a vehicle charging station 69, an image unit 74, and a controller 75.
- the communication unit 52 and the controller 75 may be integrated in a system on chip embedded in the cleaning robot 1i, but are not limited to being integrated in one system on chip as described above.
- the imaging unit 74 obtains image information obtained by imaging a specific area.
- the imaging unit 74 may be implemented through a camera module.
- the imaging unit 74 may be implemented in various forms and mounted on one surface of the cleaning robot 1i or on the aircraft 68.
- the imaging unit 74 is implemented in the form of a rod as shown in FIG. 30 and has four degrees of freedom.
- the degrees of freedom in four directions mean degrees of freedom of height, rotation, inclination, and tilting.
- the imaging unit 74 has four motors built therein so that the four directions can be adjusted.
- a damper structure may be applied to the imaging unit 74 to reduce vibration generated when the cleaning robot moves.
- as another example, the imaging unit 74 may be implemented in a form in which a triangular support is attached to the rod and a camera is mounted on the support. At this time, the imaging unit 74 may adjust the support through an actuator.
- the imaging unit 74 may be implemented in a tensioning manner by pulling a wire.
- since the motor pulling the wire is disposed close to the main body of the cleaning robot, there is an advantage that the volume and size of the imaging unit 74 can be reduced.
- the imaging unit 74 may control height, inclination, rotation, and tilting by rotating a pillar having two asymmetric inclinations.
- the imaging unit 74 may be implemented in various forms to acquire image information by photographing in various directions and at various heights, without limitation. That is, the imaging unit 74 may be implemented in various forms as described above and mounted on at least one of one surface of the cleaning robot 1i and the aircraft 68.
- image information may be acquired through a wider field of view through the image unit 74 implemented in the form of a bar.
- the imaging unit 74 may be mounted on the vehicle 68.
- for example, the imaging unit 74 may be mounted on the bottom of the aircraft 68, which can fly around the entire house and acquire image information.
- the cleaning robot 1i may be provided with a vehicle charging station 69 that charges a battery built in the aircraft 68.
- the cleaning robot 1i can move with the aircraft 68 mounted on it.
- the charging station 69 may charge the battery of the vehicle 68 through wired or wireless charging, without limitation.
- the controller 75 may transmit the induction signal such as an infrared signal, an ultrasonic signal, or a beacon signal through the IR sensor to the aircraft 68 to control the vehicle 68 to be seated on the vehicle charging station 69.
- the image unit 74 is connected to the controller 75, and the controller 75 may receive the image information obtained through the image unit 74 and control the communication unit 52 to provide it to the user.
- the communication unit 52 may be provided in the cleaning robot 1i.
- since the general description of the communication unit 52 is the same as described above, a detailed description thereof will be omitted.
- the communication unit 52 may transmit and receive a signal including various data with the user terminal through a communication network.
- the communication unit 52 receives a remote access request from the user terminal and performs a connection, thereby connecting the cleaning robot 1i to the user terminal.
- the communication unit 52 may transmit and receive a signal including data with a vehicle through a wireless communication network.
- the communication unit 52 may transmit a control command to the aircraft through the wireless communication network.
- the communication unit 52 may receive image information from the image unit 74 mounted on the vehicle 68.
- the controller 75 may control the overall operation of the cleaning robot 1i. Since the general description of the control unit 75 is the same as described above, it will be omitted.
- the controller 75 may control the operation of the image unit 74 through the control signal, thereby controlling the acquisition of the image information.
- the controller 75 controls the communication unit 52 through a control signal, thereby transmitting and receiving data to and from external devices such as the aircraft 68.
- the controller 75 may control a device in the cleaning robot 1i to stop at least one operation being performed in response to a user's request for remote access through a control signal.
- the communication unit 52 may be connected to the user terminal through a communication network.
- the control unit 75 stops an operation currently being performed through a control signal, and controls the device in the cleaning robot 1i to acquire image information according to a signal received from the user terminal.
- the controller 75 may control the operation of the driving motor 100 of the cleaning robot 1i to move the cleaning robot 1i to a region selected by the user. Alternatively, the controller 75 may control the operation of the vehicle 68 to move the aircraft 68 to a region selected by the user.
- the controller 75 may obtain image information regarding the area to be photographed according to the remote control of the user terminal linked through the communication unit 52.
- the user terminal may store an application for controlling the operation of the cleaning robot 1i in conjunction with the cleaning robot 1i.
- the version of the application can be continuously updated. The update may be performed according to a request of a user, a request of an application distributor, or a predetermined period.
- the user can input an operation command of the cleaning robot 1i using the interface displayed on the display of the user terminal.
- the controller 75 may receive an operation command through the communication unit 52, and control the operation of the cleaning robot 1i in response thereto.
- when the controller 75 is performing another task, it may stop that task in order to perform photographing according to the remote control of the user.
- the controller 75 may store progress information about the other task in a memory and resume the task when the remote connection with the user terminal is terminated.
- the controller 75 may perform the shooting by controlling the cleaning robot 1i according to the remote control input by the user through the application.
- the user may directly command a remote operation of the cleaning robot or set a region of interest through the user terminal.
- the user may set a photographing plan through the application.
- the cleaning robot may store map information obtained through a 3D sensor or a camera.
- the user terminal may receive the map information through the communication unit 52 and display it on the display. Then, the user may set a region of interest in the map information. In this case, the user may reserve photographing of the region of interest for a specific time period through the application. In addition, the user may set priorities among regions of interest through the application.
- the user terminal may display map information through a display.
- the user terminal may set one or more zones selected by the user as the region of interest among the displayed map information, and transmit the information about this to the communication unit.
- the user may also set the photographing height, the photographing angle, and the photographing time of the image unit.
- the user may set the shooting order for the plurality of zones. That is, the user terminal may set the shooting plan in various ways through the application.
- the controller 75 may acquire the image information by controlling the operation of the cleaning robot 1i by reflecting the photographing plan set by the user, and may transmit the obtained image information to the user terminal. Accordingly, the user may be provided with indoor image information even when the user is outside.
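The photographing plan described above (regions of interest with priority, photographing height, angle, and reserved time) might be represented as a small data structure in the application. This is a hypothetical sketch; the class name `ShotPlan`, the fields, and the default values are all invented for illustration.

```python
from dataclasses import dataclass

@dataclass
class ShotPlan:
    zone: str            # zone selected as a region of interest on the map information
    priority: int        # lower number = photographed first
    height_cm: int = 50  # photographing height of the image unit
    angle_deg: int = 0   # photographing angle
    time: str = "now"    # reserved time slot for the shot

def ordered_plan(shots):
    """Order the regions of interest by the priority set by the user."""
    return sorted(shots, key=lambda shot: shot.priority)

plan = ordered_plan([
    ShotPlan("kitchen", priority=2, time="18:00"),
    ShotPlan("front door", priority=1, angle_deg=30),
])
```

The ordered plan would then be transmitted to the communication unit 52 so the controller 75 can move the robot (or the aircraft 68) zone by zone and return the acquired image information.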
- the controller 75 may support the telepresence function by transmitting the image information acquired through the communication unit 52 to the IoT apparatus in real time.
- telepresence means a virtual video conferencing system between users. That is, the controller 75 may receive image information of another user from the IoT apparatus through the communication unit 52 and display it on the display 42.
- the cleaning robot according to the embodiment may provide not only a real time meeting but also various services such as remote medical treatment through an image connection with a doctor.
- FIG. 36 is a flowchart illustrating an operation of a cleaning robot for acquiring image information through a remote connection according to an embodiment.
- FIG. 37 is a diagram for explaining a case of acquiring image information of a desired area through an image unit implemented in a rod form according to an embodiment.
- the cleaning robot may receive a remote access request from the user (3500).
- the user may transmit a remote access request to the cleaning robot through an IoT apparatus such as a user terminal.
- the cleaning robot may either directly perform a connection with the IoT apparatus or first stop another operation, depending on whether the cleaning robot is performing another task (3510). For example, when performing another task such as a cleaning task, the cleaning robot may stop the task currently being performed and store the processing result of that task and the task information to be performed later (3520). Accordingly, after the remote connection of the user is terminated, the cleaning robot can resume the suspended work without a separate command.
- the cleaning robot may perform a remote connection with the IoT apparatus through the communication network (3530). Accordingly, the cleaning robot may acquire and transmit image information regarding a desired area according to the remote control of the user. In addition to acquiring image information for indoor security, as shown in FIG. 37, the user may check a position of interest through the image information, for example to determine where an item left behind is located.
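The suspend-and-resume behavior of steps 3510 to 3530 can be sketched as a tiny state machine. The class and field names are illustrative, not from the patent; only the behavior (store the running task's state on a remote request, resume it without a separate command when the session ends) follows the text.

```python
class RemoteSessionHandler:
    def __init__(self):
        self.current_task = None
        self.saved_progress = None

    def on_remote_request(self):
        if self.current_task is not None:            # 3510: another task running?
            self.saved_progress = self.current_task  # 3520: store its processing state
            self.current_task = None
        return "connected"                           # 3530: perform the remote connection

    def on_session_end(self):
        # resume the suspended work without a separate command from the user
        self.current_task, self.saved_progress = self.saved_progress, None
        return self.current_task

robot = RemoteSessionHandler()
robot.current_task = ("cleaning", "living room, 40% done")
robot.on_remote_request()
resumed = robot.on_session_end()
```

After the remote session ends, the stored cleaning task is restored exactly as it was suspended.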
- FIG. 38 illustrates a control block diagram of a cleaning robot that senses a sound and performs a process corresponding thereto according to an embodiment.
- the cleaning robot 1j may be provided with a communication unit 52, a memory 59, a voice recognition module 76, an image unit 74, a control unit 77, and a driving motor 100.
- the communication unit 52, the memory 59, and the controller 77 may be integrated in a system on chip embedded in the cleaning robot 1j, but are not limited to being integrated in one system on chip as described above.
- the voice recognition module 76 may include a voice recognition unit 62 that recognizes a voice and identifies the point at which the recognized voice is generated, and a module controller 78 that controls the overall operation of the voice recognition module 76.
- the voice recognition unit 62 may recognize a voice. As described above, the voice recognition unit may be implemented through a microphone to convert the received voice into an electrical signal. The speech recognition unit 62 may derive the speech waveform or convert the speech waveform into text.
- the voice recognition unit 62 may identify a point where the voice is generated.
- the voice recognition unit 62 may be implemented through microphones arranged in an array form. Accordingly, when a voice is input, the microphones arranged in the array can identify the direction from which the voice was input, and thus the point where the voice is generated.
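A deliberately simplified sketch of how an array of microphones could yield a direction: here the bearing of the microphone recording the highest level is taken as the source direction. This is an assumption for illustration only; practical arrays typically use time-difference-of-arrival rather than raw levels, and the bearings and sample levels below are invented.

```python
# Four microphones assumed to face the cardinal directions around the robot body
MIC_BEARINGS_DEG = [0, 90, 180, 270]

def estimate_direction(levels):
    """levels: one signal level per microphone -> approximate bearing in degrees."""
    loudest = max(range(len(levels)), key=lambda i: levels[i])
    return MIC_BEARINGS_DEG[loudest]

# a voice from roughly the 90-degree side excites the second microphone most
bearing = estimate_direction([0.2, 0.9, 0.3, 0.1])
```

The controller could then drive the robot toward the estimated bearing, as the text describes for moving near the point where the voice is generated.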
- the module controller 78 may control the overall operation of the voice recognition module 76.
- the module controller 78 may derive a recognition result for the voice recognized by the voice recognizer 62.
- the module controller 78 may derive the recognition result through the data stored in the memory, or may derive the recognition result through an external voice recognition server.
- the module controller 78 may be connected to a voice recognition server located outside through the communication unit to transmit a voice waveform or text and receive a recognition result corresponding thereto.
- the external voice recognition server is a server existing outside the cleaning robot; it may be a gateway server of the home network or a server existing outside the house, without limitation.
- recognition results corresponding to major voice commands may be stored in the memory 59. Accordingly, the module controller 78 may derive a recognition result for a voice using the data stored in the memory 59 without having to connect to an external voice recognition server through the communication network. Thus, the cleaning robot 1j according to the embodiment can prevent overload of the communication network and derive the recognition result more quickly.
- the module controller 78 may store the voice input through the voice recognition unit 62 in the memory. Accordingly, the module controller 78 may transmit the voice stored in the memory to the user terminal; a detailed description thereof will be given later.
- a voice corresponding to each recognition result may be mapped and stored in the memory 59.
- for example, the module controller 78 may output a voice such as 'Who are you?' through the speaker.
- the voice may be a voice of a user or a voice of a pre-stored person. That is, the memory 59 maps and stores voices suitable for each situation, and the module controller 78 may transmit the voices suitable for the situation through the speaker. That is, the module controller 78 may transmit a preset voice in response to the recognition result regarding the voice.
- the voice suitable for each situation may be preset in the design of the cleaning robot 1j and stored in the memory.
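The mapping of situations to preset voices kept in the memory 59 can be sketched as a simple lookup. The situation keys and utterances below are invented examples; only the idea of mapping each recognition result to a preset voice comes from the text.

```python
# Illustrative situation-to-voice mapping, as might be stored in memory 59
PRESET_VOICES = {
    "stranger_at_front_door": "Who are you?",
    "doorbell": "Please wait a moment.",
}

def voice_for(recognition_result, default="(no preset voice)"):
    """Look up the preset voice mapped to a recognition result."""
    return PRESET_VOICES.get(recognition_result, default)

utterance = voice_for("stranger_at_front_door")
```

The module controller 78 would then output the selected utterance through the speaker, falling back to some default behavior when no preset voice is mapped.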
- the controller 77 may control the overall operation of the cleaning robot 1j.
- the controller 77 may control the operation of all the components of the cleaning robot 1j such as the speaker 63 as well as various modules built in the cleaning robot 1j.
- the controller 77 may generate a control signal for controlling the components of the cleaning robot 1j to control the operations of the above-described components.
- the controller 77 may control the speaker 63 to transmit a voice corresponding to the recognition result derived through the module controller 78.
- the controller 77 may interwork with the communication unit 52 to transmit image information obtained through the image unit 74 to the user terminal.
- the controller 77 may perform a telephone connection to a preset contact based on the recognition result derived through the module controller 78.
- the preset contact includes a phone number or ID of a messenger application. That is, the controller 77 may connect a wired telephone to a preset telephone number, deliver a message through a messenger application, or connect an internet telephone.
- the contact may be set through the input unit of the cleaning robot 1j or through an application installed in the user terminal.
- FIG. 39 is a flowchart illustrating an operation of a cleaning robot that senses a sound and performs a process corresponding thereto according to an embodiment.
- FIG. 40 is a diagram for explaining an operation of a cleaning robot that responds to a sound detected at the front door according to an embodiment.
- the cleaning robot may detect a sound through the voice recognition unit (3800).
- the sound includes not only a voice uttered by the user, but also all sounds from various sound sources such as noise.
- the cleaning robot may obtain the direction information of the sound through the speech recognition unit arranged in an array form. Accordingly, the cleaning robot can identify the direction information of the sound, that is, the point where the sound source is generated, and move to the vicinity of the aforementioned point.
- the cleaning robot may detect a sound generated at the front door. Accordingly, the cleaning robot can record the sound generated by moving near the front door and transmit the recorded sound to the user terminal through the communication network.
- the cleaning robot may record a sound generated at a point moved through the voice recognition module (3810). Accordingly, the cleaning robot can transfer the recorded data to the user terminal.
- the remote response service refers to a service that supports the user's remote response through a remote connection with the user terminal. Support for the remote response service may be preset by the user or preset in the design of the cleaning robot.
- the cleaning robot may perform a connection with the user terminal through the communication network (3830). Accordingly, when the connection with the user terminal is completed, the cleaning robot may be operated according to a control command received from the user terminal. Thus, even when located outside, the user can cope with an emergency situation through a remote response using the user terminal.
- when not connected to the user terminal, the cleaning robot may output a predetermined voice through a built-in speaker, or through a speaker existing in the room connected via the communication network (3850).
- the sound to be transmitted may be preset according to the type of sound or the point where the sound is identified. In other words, the cleaning robot sends out a voice suitable for the situation, so that the cleaning robot can appropriately prepare for an emergency.
- the cleaning robot transfers the corresponding result through the preset voice to the user terminal (3860), thereby helping the user to grasp the indoor condition.
- the cleaning robot according to the embodiment may not only detect sounds through the voice recognition module and perform a corresponding process, but also detect the user's voice and provide a corresponding process to increase the user's convenience.
- FIG. 41 illustrates a control block diagram of a cleaning robot that performs a process corresponding to a voice recognition result
- FIG. 42 illustrates a radiation pattern transmitted from an IoT apparatus, according to an exemplary embodiment.
- the cleaning robot 1k includes a battery 50, a communication unit 52, a memory 59, a speaker 63, an image unit 74, a voice recognition module 76, a display 42, a controller 79, and a drive motor 100.
- the voice recognition module 76 and the controller 79 may be integrated in a system on chip embedded in the cleaning robot 1k and operated by a processor. However, as described above, since the system on chip embedded in the cleaning robot 1k may be not one but plural, they are not limited to being integrated in only one system on chip.
- the controller 79 may control the overall operation of the cleaning robot 1k. Since the general description of the control unit 79 is the same as described above, it will be omitted.
- the controller 79 may include a cleaning controller 66 and a recognizer 80.
- the cleaning control unit 66 controls the overall operation related to cleaning, and the recognition unit 80 may receive a user's command and recognize the user's location.
- the recognition unit 80 may receive various kinds of call signals and recognize the location of the user from the call signals.
- the call signal means a signal that the user calls the cleaning robot 1k to transmit a command to the cleaning robot 1k.
- the call signal includes a voice signal of a user, as well as a wireless signal transmitted from an IoT apparatus such as a user terminal.
- the recognition unit 80 may recognize a user's call signal in cooperation with components in the cleaning robot 1k and determine a user's location. In one embodiment, the recognition unit 80 may recognize the user's voice through the voice recognition module 76. In this case, the voice used as the call signal may be preset and stored in the memory 59.
- the recognition unit 80 may not only recognize a preset voice, but also identify a location of a sound source, that is, a user's location through the voice recognition unit 62 arranged in an array. Then, the control unit 79 may control the cleaning robot 1k to move near the point where the user is located by controlling the driving motor 100 through a control signal.
- the recognition unit 80 may recognize a call signal by detecting a motion of the user through the image unit 74.
- the motion used as the call signal may be preset and stored in the memory 59. Accordingly, when the recognition unit 80 recognizes the call signal, the control unit 79 may derive the position of the user from the image information. Then, the control unit 79 may control the cleaning robot 1k to move near the point where the user is located by controlling the driving motor 100 through a control signal.
- a user may operate a IoT apparatus such as a user terminal to transmit a wireless signal.
- the recognition unit 80 may determine the location of the user who is operating the IoT apparatus based on the wireless signal received through the communication unit 52.
- the recognizer 80 may calculate a radiation pattern radiated from the antenna of the IoT apparatus. Radiated power is the power emitted as the antenna radiates a signal, and the radiation pattern expresses the radiated power of the antenna of the IoT apparatus as a function of direction as signals are exchanged between the IoT apparatus and the communication unit 52.
- the recognition unit 80 may determine the direction in which the IoT apparatus is located using the radiation pattern. For example, in FIG. 42, the cleaning robot 1k is positioned at the center of the radiation pattern, and the IoT apparatus that transmitted the call signal is located in the direction where the signal strength in the radiation pattern is strongest. Here, the strength of the signal means the received signal strength indication (RSSI). Accordingly, when the recognition unit 80 recognizes the user's position from the call signal, the controller 79 may control the driving motor 100 through a control signal so that the cleaning robot 1k moves near the point where the user is located. Meanwhile, the recognition unit 80 may also determine the direction and position from which the radio signal is transmitted through various other known methods, and is not limited to the above.
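The radiation-pattern idea of FIG. 42 can be sketched as follows: with the robot at the center, the caller is assumed to lie in the bearing where the received signal strength is greatest. The sampled pattern below (bearings and RSSI values in dBm) is invented for the example and is not from the patent.

```python
def strongest_direction(pattern):
    """pattern: {bearing_deg: rssi_dbm} -> bearing with the strongest signal."""
    return max(pattern, key=pattern.get)

# Hypothetical sampled radiation pattern around the robot (RSSI is less
# negative = stronger), as might be built while exchanging signals with
# the IoT apparatus through the communication unit 52
radiation_pattern = {0: -70, 45: -62, 90: -55, 135: -61, 180: -68}
user_bearing = strongest_direction(radiation_pattern)
```

The controller 79 would then drive the robot toward `user_bearing`, the direction assumed to contain the IoT apparatus that transmitted the call signal.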
- the controller 79 may receive a user voice through the voice recognition module 76 and provide a service corresponding to a recognition result regarding the received voice. For example, the controller 79 may receive a voice command related to the IoT apparatus, as well as a voice command related to cleaning, and provide a service corresponding thereto. That is, the controller 79 may not only operate the cleaning robot 1k according to the voice recognition result, but also operate the IoT apparatus by transmitting a control command to the IoT apparatus through the home network.
- the controller 79 is provided with a security module 81, so that usage rights can be restricted for each service.
- the security module 81 may perform security authentication of the user by combining at least one of voiceprint recognition and voice recognition.
- the security module 81 may register at least one of a voice and a face for each user by using at least one of the imaging unit 74 and the voice recognition module 76, and set a use right for each user.
- the use right is a right regarding a service that can be provided by the user, and can be set together when initially registering at least one of a voice and a face of the user, and can be changed later through a security authentication procedure.
- the controller 79 may determine whether the user has a right to receive a service corresponding to the recognition result regarding the voice, and determine whether to provide the service based on the determination result. Accordingly, the controller 79 may provide a service corresponding to the voice recognition result if the user has a use right.
- the security module 81 may register the use right of the user through a security authentication procedure.
- the security authentication procedure may be performed by at least one of the user's voice and voiceprint recognition.
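The per-user usage-rights check performed through the security module 81 can be sketched as a simple lookup. The user names, service names, and rights table below are invented for illustration; only the rule (provide a service only when the recognized user holds the matching right) follows the text.

```python
# Illustrative rights registered per user during face/voice registration
REGISTERED_RIGHTS = {
    "parent": {"cleaning", "iot_control", "schedule"},
    "child": {"cleaning"},
}

def authorize(user, service):
    """Return True only if the recognized user may receive the service."""
    return service in REGISTERED_RIGHTS.get(user, set())

allowed = authorize("child", "iot_control")  # this request would be denied
```

A recognized but unregistered user, or a registered user without the matching right, would be refused the service, while rights can later be changed through the security authentication procedure.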
- when the user requests a schedule check, the controller 79 may perform real-time mirroring with a schedule management application installed on the user terminal through the communication unit 52 to grasp the user's schedule, and output it through the speaker 63.
- when the user changes the schedule, the controller 79 may recognize the change through the voice recognition module 76 and transmit it to the user terminal, thereby updating the schedule management.
- likewise, the controller 79 may recognize such a change and transmit information on the change to a wearable device through the communication unit 52, thereby changing the settings of an alarm application installed in the wearable device, without limitation.
- FIG. 43 is a flowchart illustrating an operation of a cleaning robot operating through security authentication according to an embodiment
- FIG. 44 is a diagram for explaining a case in which a cleaning command is performed and an IoT apparatus is controlled by receiving a user's voice command.
- the cleaning robot may recognize the call signal (4100).
- the user may transmit various kinds of call signals to transmit a voice command to the cleaning robot, and the cleaning robot may recognize this.
- the call signal includes a voice signal of a user, as well as a wireless signal of an IoT apparatus.
- the cleaning robot may stop the operation in operation 4110 and store processing results related to the operation in order to resume the operation later.
- the cleaning robot needs to approach the user to receive a voice command, so identifying the user's location is necessary.
- the cleaning robot recognizes the user's voice by using the voice recognition unit arranged in an array form, thereby determining the location of the user.
- the cleaning robot may determine the user's position by recognizing the user's motion through the image unit.
- the cleaning robot can determine the location of the user through various known methods, such as estimating the user's position based on the strength of the signal emitted through the antenna of the IoT apparatus, as described above.
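- As one hedged illustration of the array-microphone approach above, the arrival angle of a voice can be estimated from the time difference of arrival (TDOA) between two microphones; the spacing and speed-of-sound values below are assumptions, not values from the disclosure.

```python
import math

SPEED_OF_SOUND = 343.0  # m/s at room temperature (assumed)

def direction_from_tdoa(delay_s, mic_spacing_m):
    """Estimate the arrival angle (radians, 0 = broadside) of a voice
    from the arrival-time difference between two array microphones."""
    ratio = SPEED_OF_SOUND * delay_s / mic_spacing_m
    ratio = max(-1.0, min(1.0, ratio))  # clamp numerical noise
    return math.asin(ratio)
```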
- the cleaning robot may rotate and move to a position where the voice of the user may be recognized (4120).
- the cleaning robot may output a voice requesting the user to input a voice command through the speaker, or display a pop-up message requesting the voice command on the display.
- the cleaning robot may receive a voice command of the user through the voice recognition module (4130).
- the cleaning robot may derive a voice recognition result corresponding to the user's voice command by using the data stored in the memory, or may transmit the voice command to an external server and receive the voice recognition result corresponding to it.
- the cleaning robot may perform security authentication to determine whether the user has a right to use a service corresponding to the voice recognition result (4140).
- the cleaning robot may not only prevent an unspecified number of people from being provided a personal service, but also prevent indiscriminate viewing of personal information.
- the cleaning robot may protect privacy by performing security authentication as to whether the user has permission to view the personal schedule.
- the cleaning robot may provide a service to the user by performing an operation corresponding to the voice command (4180).
- the cleaning robot may provide a service through an IoT apparatus connected through a home network as well as a service through the cleaning robot.
- the cleaning robot may again perform the operation previously stopped (4190).
- the cleaning robot may perform a registration procedure. Accordingly, when the usage right is registered, the cleaning robot may provide a service to the user by performing an operation corresponding to the voice command. On the other hand, if the usage right is not registered, the cleaning robot does not provide a service corresponding to the voice command, and may perform the operation previously stopped again (4170).
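- The flow of FIG. 43 (operations 4100 through 4190) can be sketched as follows; the step labels come from the figure, while the function and the returned log of steps are illustrative assumptions.

```python
def handle_call_signal(command, user_authorized):
    """Sketch of the FIG. 43 flow; returns the ordered list of steps taken."""
    log = []
    log.append("stop_and_save")          # 4110: stop cleaning, store progress
    log.append("move_to_user")           # 4120: rotate/move toward the user
    log.append(f"recognize:{command}")   # 4130: derive the voice recognition result
    if user_authorized:                  # 4140: security authentication
        log.append(f"serve:{command}")   # 4180: perform the corresponding operation
    else:
        log.append("register_or_deny")   # registration procedure, or no service
    log.append("resume")                 # 4190: resume the stopped operation
    return log
```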
- FIG. 45 illustrates a control block diagram of a cleaning robot that determines a user's state based on biometric information of a user
- FIG. 46 is a diagram for describing a case in which a voice recognition unit arranged in an array form receives a user's voice command, according to an embodiment.
- the cleaning robot 1l includes a display 42, an input unit 43, a communication unit 52, an image unit 74, a voice recognition module 76, a speaker 63, a first memory 82, a sensor module 84, a controller 85, a 3D sensor 86, and a drive motor 100.
- the communication unit 52, the voice recognition module 76, the first memory 82, the sensor module 84, and the control unit 85 may be integrated in a system on chip embedded in the cleaning robot 1l and operated by a processor. However, since more than one system on chip may be embedded in the cleaning robot 1l, the integration is not limited to only one system on chip.
- the voice recognition module 76 may include a voice recognition unit 62, a module controller 78, and a second memory 83.
- the voice recognition unit 62 may be implemented through microphones arranged in an array form as described above. Accordingly, the voice recognition unit 62 may identify the point where the voice is generated.
- a microphone m arranged in an array form may be mounted on one side of the cleaning robot 11. Accordingly, the microphones m arranged in an array form may identify a direction in which the voice is input when the voice is input, and identify a point where the voice is generated.
- the module controller 78 may control the overall operation of the voice recognition module 76. Since the general description of the module controller 78 is the same as described above, it will be omitted.
- the module controller 78 may detect abnormal signs by analyzing the voice of the user input through the voice recognition unit 62. For example, the module controller 78 may analyze the voice of the user to determine whether it is in an emergency, call for help, or scream.
- the user's voice may be recorded in the second memory 83, or data regarding voice patterns that typically change when a user is in an emergency may be stored there. Accordingly, the module controller 78 may detect whether an abnormal symptom has occurred to the user even without interworking with an external server.
- alternatively, the module controller 78 may detect abnormal signs by deriving an analysis result about the voice through the voice recognition server, but is not limited thereto.
- the control unit 85 may control the operation of the driving motor 100 to move the cleaning robot to the point where the user is identified as being located.
- the operation of the control unit 85 will be described later.
- the sensor module 84 may obtain the biometric information of the user.
- the sensor module 84 may be composed of at least one sensor capable of acquiring various information related to the user's body, such as heartbeat, body temperature, and movement, and a processor such as an MCU that controls the operation of the aforementioned sensors.
- the sensor module 84 may obtain the biometric information of the user and transmit it to the control unit 85.
- the communication unit 52 may exchange data with an external device through a communication network. Since a general description of the configuration of the communication unit 52 is the same as described above, it will be omitted.
- the input unit 43 may receive various control commands related to the cleaning robot from the user.
- the input unit 43 may be provided in a dial form on one surface of the cleaning robot.
- the display 42 may serve as the input unit 43.
- the first memory 82 may be provided in the cleaning robot 11.
- the first memory 82 may be implemented as one memory together with the second memory 83.
- the first memory 82 stores data for determining the degree of user's state according to the biometric information.
- the first memory 82 may store biometric information of a user normally acquired through the sensor module 84.
- average state information for each age or age group may be stored in the first memory 82.
- the age information of the user input from the user through the input unit 43 may also be stored in the first memory 82.
- the first memory 82 stores various data necessary for determining the degree of the user's state.
- the first memory 82 stores a countermeasure according to a state degree.
- the coping scheme means a scheme using at least one of an operation of the cleaning robot 11 and an operation of at least one IoT apparatus connected through a home network, selected according to the degree of the user's state.
- the controller 85 may control the overall operation of the cleaning robot 11.
- the controller 85 may control operations of all components of the cleaning robot 11, such as the display 42 and the speaker 63, as well as various modules built in the cleaning robot 11.
- the controller 85 may generate a control signal for controlling the components of the cleaning robot 11 and control the operations of the above-described components.
- the controller 85 may determine the degree of the user's state by using the user's biometric information. For example, the controller 85 may compare the user's biometric information stored in the first memory 82 with the biometric information acquired through the sensor module 84 and determine the degree of the user's state according to the degree of change. As another example, the controller 85 may determine the degree of the user's state according to how much the acquired biometric information deviates from stored average state information.
- the first memory 82 may store state information for each age or age group.
- the cleaning robot 1l may receive the user's age through the input unit 43. Accordingly, the controller 85 may determine the degree of the user's state by comparing the state information for each age with the state information acquired through the sensor module.
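- The comparison of acquired biometric readings against a stored baseline, described above, can be sketched as follows; the metric names, tolerance, and grading labels are illustrative assumptions, not values from the disclosure.

```python
def state_degree(current, baseline, tolerance=0.1):
    """Grade the user's state by the worst relative deviation of any
    biometric reading from its stored baseline (thresholds assumed)."""
    worst = max(abs(current[k] - baseline[k]) / baseline[k] for k in baseline)
    if worst <= tolerance:
        return "normal"         # within the tolerated change
    if worst <= 3 * tolerance:
        return "caution"        # noticeable change: ask the user via speaker
    return "emergency"          # severe change: trigger countermeasures
```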
- biometric history information input by the user may be stored in the first memory 82.
- the biometric history information refers to information about the diseases, particular conditions, medical history, and the like of each user.
- the user may input his or her biometric history information through the input unit 43 or an application installed in the user terminal.
- the controller 85 may control a device in the cleaning robot 11 based on the corresponding countermeasure according to the determined degree.
- the control unit 85 may set a countermeasure step by step according to the degree of state, thereby providing a suitable countermeasure for the current user to avoid a crisis situation.
- the controller 85 may control the speaker 63 to output a voice asking the user about his or her current state. Accordingly, the control unit 85 may select one of the coping methods stored in the first memory 82 based on the recognition result of the user's voice input through the voice recognition unit 62, and control the operation of the cleaning robot 11 according to the selected coping method.
- the controller 85 may select a countermeasure according to the determination result regarding the degree of state, and control the operation of the cleaning robot 11 according to the selected countermeasure.
- the control unit 85 may be linked with the home network server through the communication unit 52 to control the operation of various IoT devices existing in the room.
- the control unit 85 may turn on the air conditioner in association with the home network server so that air of a suitable temperature is blown through the air conditioner.
- the control unit 85 may control the on/off of the lighting in association with the home network server, and may request help by notifying the outside of the emergency situation through a siren.
- the controller 85 may attempt to connect to the phone through a preset contact, or may transmit an SMS message or an instant message.
- the controller 85 may perform all of the above-described operations, or may freely combine the countermeasures stored in the first memory 82.
- the countermeasures may be set in advance by the user or when the cleaning robot 11 is designed.
- the countermeasure may be updated through the communication unit 52.
- the controller 85 may periodically acquire the biometric information of the user by controlling the sensor module 84 according to a preset period even when no abnormality is detected.
- the predetermined period may be set when the cleaning robot 11 is designed or by the user.
- the controller 85 may recognize the location of the user by using various devices capable of sensing the user, such as the imaging unit 74 or the 3D sensor 86, and control the driving motor to move near the point where the user is located. Accordingly, the controller 85 may determine the user's state from the biometric information obtained through the sensor module 84, and store the obtained biometric information together with the acquisition time information in the memory when the state is determined to be normal.
- the control unit 85 may map the acquisition time information onto the obtained biometric information to generate log data and store it in the memory.
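- Generating log data by mapping the acquisition time onto the biometric readings might look like the following sketch; the field names and timestamp convention are assumptions.

```python
import time

def make_log_entry(biometrics, now=None):
    """Map the acquisition time onto a biometric reading to form one log record."""
    return {"timestamp": now if now is not None else time.time(), **biometrics}

# Accumulate records; in the device these would go to the first memory 82
# or an external server.
log = []
log.append(make_log_entry({"heart_rate": 68, "body_temp": 36.4}, now=1_700_000_000))
```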
- the log data is not only stored in the first memory 82, but may be stored in an external server.
- FIG. 47 is a flowchart illustrating an operation of a cleaning robot that operates according to a user's state determined based on the user's biometric information, according to an exemplary embodiment.
- the cleaning robot may detect an abnormal indication from a user's voice (4600). For example, the cleaning robot compares the user's voice stored in a memory with the user's voice input through the array microphone, and detects an abnormal occurrence when a severe change in the voice waveform occurs. Alternatively, general information on the voice waveform produced when an abnormal symptom occurs may be stored in the memory, and the cleaning robot can detect the occurrence of the abnormal symptom based on this stored information.
- the memory may store information about a scream or the voice waveform produced when calling for help. That is, the memory may store information on the main features of the voice waveform produced when an abnormality occurs to the user. In addition, the memory may store information on various features that can be derived from the voice, and is not limited to the shape of the waveform.
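- One minimal way to flag an abnormal sign from voice features, as described above, is to compare a simple energy feature against the user's stored baseline; the choice of RMS energy and the threshold factor are illustrative assumptions.

```python
def rms(samples):
    """Root-mean-square energy of a list of audio samples."""
    return (sum(s * s for s in samples) / len(samples)) ** 0.5

def abnormal(samples, baseline_rms, factor=2.0):
    """Flag an abnormal sign when the voice energy deviates sharply
    from the user's stored baseline (feature and threshold assumed)."""
    return rms(samples) > factor * baseline_rms
```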
- the cleaning robot can grasp the direction information of the voice input through the microphone implemented in an array form. Accordingly, the cleaning robot can grasp the point where the user's voice is generated and move to the point detected using the driving motor and the wheel.
- the cleaning robot may acquire the biometric information of the user through the sensor module (4610).
- the biometric information includes various information for identifying a user's physical condition such as a user's heart rate, blood sugar, and body temperature.
- the sensor module is composed of various known sensors capable of grasping a user's body state, such as a heartbeat sensor and an infrared sensor, and a processor that can control the aforementioned sensors.
- the cleaning robot may determine the degree of the user's state from the biometric information of the user acquired through the sensor module, and control the device in the cleaning robot based on a countermeasure based on the determination result (4620).
- a countermeasure corresponding to a determination result regarding the degree of the user's state may be stored in the memory or the external server of the cleaning robot. If stored in the memory, the cleaning robot may search for a countermeasure corresponding to the determination result in the memory and perform an operation based on the search result. As another example, the cleaning robot may access an external server through a communication network, search for a countermeasure corresponding to the determination result, and perform an operation based on the search result.
- some of the countermeasures corresponding to the determination result regarding the degree of the user's state may be stored in the memory and the others in the external server. Accordingly, the cleaning robot may first search the information stored in the memory and access the external server if no countermeasure corresponding to the determination result is found there, but is not limited thereto.
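- The memory-first lookup with an external-server fallback can be sketched as follows; the state names and countermeasure strings are illustrative assumptions, and the server is simulated here by a plain dictionary lookup.

```python
LOCAL_COUNTERMEASURES = {"caution": "ask_user_via_speaker"}    # in built-in memory
SERVER_COUNTERMEASURES = {"emergency": "call_preset_contact"}  # on external server

def find_countermeasure(state, fetch_remote=SERVER_COUNTERMEASURES.get):
    """Search the built-in memory first; fall back to the external server
    (fetch_remote stands in for a network request) when nothing is found."""
    plan = LOCAL_COUNTERMEASURES.get(state)
    if plan is None:
        plan = fetch_remote(state)
    return plan
```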
- FIGS. 49 and 50 are diagrams for explaining a coping method through a home network when a user is in an emergency situation, according to another embodiment.
- the cleaning robot may receive a search time of the user state from the user (4700).
- the cleaning robot may receive a search time through an input unit or a display implemented as a touch screen type.
- the cleaning robot may receive a search time from the user terminal through a communication network.
- the search time may be set as a specific time such as 1:00, 2:30, or may be set as a specific period such as every 30 minutes or every 2 hours.
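- Interpreting a search time given either as a fixed time or as a period might look like this sketch; the input representation (minutes since midnight, strings such as "2:30" or "every 30") is an assumption made for illustration.

```python
def next_check(now_minutes, spec):
    """Compute the next state-check time in minutes since midnight, from
    either a fixed time like '2:30' or a period like 'every 30'."""
    if spec.startswith("every "):
        period = int(spec.split()[1])
        return now_minutes + period          # periodic check
    h, m = map(int, spec.split(":"))
    target = h * 60 + m
    # If the fixed time already passed today, schedule it for tomorrow.
    return target if target > now_minutes else target + 24 * 60
```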
- the cleaning robot may recognize the location of the user by using a device capable of sensing the user, such as a 3D sensor or an image unit, and move to the point where the user is located (4710). Accordingly, the cleaning robot may acquire the biometric information of the user through the sensor module and determine the degree of the user's state based on the obtained biometric information (4720).
- the cleaning robot may determine whether the user's state is abnormal based on the determination result (4730). If it is determined that the state is normal, the cleaning robot may store the acquired biometric information and the time information at which the biometric information was acquired (4740). At this time, the information may be stored in the built-in memory or in an external server, and the cleaning robot may store the biometric information and time information as log data, but is not limited thereto.
- the cleaning robot may output a question about the user's current state through the speaker (4750). Accordingly, the cleaning robot may change the countermeasure depending on whether the user responds to the question (4780). If there is a reaction from the user, the cleaning robot determines that the situation is not serious. Even so, when it is determined that there is a disease, the cleaning robot may make a telephone connection to a preset contact such as the attending physician, or deliver a message (4790).
- the cleaning robot may control the on / off of a light, operate a siren, or control an operation of an IoT apparatus existing in a room such as an air conditioner through a gateway server of a home network.
- the cleaning robot may recognize a user's voice through a voice recognition module and provide a service corresponding to the recognition result.
- the cleaning robot may inform the user of the action to be taken based on the determination result according to the degree of the user's state. For example, if it is determined that the user's body temperature is higher than the preset level and thus abnormal, while the room temperature is determined to be at an appropriate level, the cleaning robot may determine that the user has a fever due to a cold or the like. Accordingly, the cleaning robot can output a countermeasure such as 'an ice pack is needed' through the speaker.
- the cleaning robot can respond according to the degree of the user's state (4800). If the user's condition is judged to differ significantly from the normal condition, the cleaning robot determines that the condition cannot be improved only by the operation of the cleaning robot itself and the plurality of IoT devices in the room, and may request help externally.
- the cleaning robot may attempt to make a telephone call or transmit a message using an internal memory or an emergency contact network stored in an external server.
- the cleaning robot may control the on / off of the light, turn on the siren, or transmit an emergency signal to the guard room to notify the outside of the emergency situation.
- the cleaning robot may operate in accordance with a predetermined countermeasure, such as interlocking with the gate server of the home network and releasing the door lock of the front door to facilitate access from the outside.
- the various countermeasures described above may be preset and stored in the memory of the cleaning robot or an external server, and the cleaning robot may perform operations according to the countermeasures using them.
- the memory of the cleaning robot or an external server may store the result of the emotional analysis according to the user's state and the corresponding measures. Accordingly, the cleaning robot may perform an operation of alleviating the symptoms of the user by using the result of analyzing the emotion of the user. For example, the cleaning robot may transmit at least one of the user's biometric information and the user's voice to an external server, and receive the analysis result and the response.
- the cleaning robot may adjust the temperature, illumination, humidity, etc. of the room by interworking with the IoT apparatus through the gateway server of the home network.
- the cleaning robot may operate according to various measures such as accessing a web server through a gateway server to support a streaming service of music suitable for the emotional state of the user, or conducting a dialogue with the user.
- the cleaning robot may make a telephone connection with a preset contact, for example, a guardian, or transmit a message to the guardian.
- the cleaning robot according to the embodiment may not only determine the user's state and provide a service based on the determination result, but may also provide a multimedia service.
- FIG. 51 is a block diagram illustrating a cleaning robot for providing a multimedia service according to an exemplary embodiment.
- FIG. 52 is a diagram for describing a case of obtaining image information by following a user through a vehicle according to an exemplary embodiment.
- 53 is a diagram for describing a case of detecting a possibility of occurrence of a risk of a user and processing a corresponding process according to an embodiment.
- FIG. 54 is a diagram illustrating a case of displaying various kinds of images through a beam projector according to an embodiment.
- FIG. 55 is a diagram for describing a cleaning robot that provides a path to a specific area or a room, according to an exemplary embodiment.
- the cleaning robot 1m includes an input unit 43, a communication unit 52, an image unit 74, a SLAM module 53, a memory 59, a voice recognition module 76, a vehicle 68, a vehicle charging station 69, a display 42, a speaker 63, and a controller 87.
- at least one of the above-described components may be integrated in a system on chip embedded in the cleaning robot 1m; however, since more than one system on chip may be provided, the integration is not limited to only one system on chip.
- the input unit 43 may receive not only various control commands related to the cleaning robot 1m from the user, but also various control commands regarding various IoT apparatuses linked to the cleaning robot 1m.
- the input unit 43 may receive an object to obtain image information from the user.
- when the display 42 is implemented as a touch screen type, the display 42 may also serve as the input unit 43.
- in the image information acquired through the image unit 74, the user may touch the object to be photographed or draw a boundary line around it.
- the object may be selected by the user terminal connected through the communication unit 52 in addition to the input unit 43.
- the communication unit 52 may exchange data with an IoT apparatus located indoors through a communication network or exchange data with an external user terminal.
- an application for providing various services related to the cleaning robot 1m may be installed in the user terminal.
- the user terminal may display the image information acquired by the cleaning robot 1m through an application, and may provide a user interface (UI) for selecting the object to be followed from the image information.
- the user may select the object through the user interface, and the user terminal may transmit information about the selected object to the communication unit through the communication network.
- the cleaning robot 1m may grasp information about the selected object.
- the imaging unit 74 may acquire image information about the object. Since the general description of the image unit 74 is the same as described above, it will be omitted.
- the imaging unit 74 may track an object included in the image information by using a tracking algorithm.
- the tracking algorithm refers to a technique of tracking a specific object through image processing in image information.
- the tracking algorithm can be implemented through various known techniques.
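- As a minimal stand-in for such a tracking algorithm, the object's position can be followed by recomputing the centroid of its pixels in each frame of a binary mask; real implementations would use more robust image-processing techniques, and the frame representation below is an assumption.

```python
def centroid(mask):
    """Centroid (x, y) of truthy pixels in a binary frame given as nested lists."""
    pts = [(x, y) for y, row in enumerate(mask) for x, v in enumerate(row) if v]
    n = len(pts)
    return (sum(x for x, _ in pts) / n, sum(y for _, y in pts) / n)

def track(frames):
    """Follow the object across frames by recomputing its centroid in each one."""
    return [centroid(f) for f in frames]
```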
- the imager 74 may be mounted on an aircraft or a rod so that the lens follows the movement of the object.
- the imaging unit 74 may be mounted on a vehicle. Accordingly, the aircraft may follow the user u to obtain image information and transmit the image information to the cleaning robot. Then, the communication unit 52 may deliver the image information to the user terminal through the gateway server of the home network.
- the cleaning robot 1m may be provided with a voice recognition module 76.
- since the detailed description of the voice recognition module 76 is the same as described above, it will be omitted.
- the voice recognition module 76 may receive a voice of the object selected by the user through the voice recognition unit 62. Accordingly, the voice recognition module 76 may derive a voice recognition result regarding the voice received through the memory or the voice recognition server as described above.
- the controller 87 may control the overall operation of the cleaning robot 1m. Since the general description of the controller 87 is the same as described above, it will be omitted. For example, the controller 87 may control the motor embedded in the vehicle or the rod through the control signal so that the imager 74 may follow the movement of the object.
- the controller 87 may control the operation of the cleaning robot 1m to perform a process corresponding to a situation generated according to the movement of the object.
- the controller 87 may detect contamination generated according to the movement of the object from the image information, control the operation of the cleaning robot 1m, and perform appropriate cleaning.
- the beverage that was in the cup may spill as the subject moves.
- the controller 87 may perform cleaning on the contaminated area detected from the image information by using the wet cleaning head according to the wet cleaning mode.
- dust existing in a bookcase or the like may be poured onto the floor according to the movement of the object.
- the controller 87 may perform cleaning on the contaminated area detected from the image information using the dry cleaning head in the dry cleaning mode. That is, the cleaning robot 1m according to the embodiment may provide not only a multimedia function but also a cleaning function which is a function of the cleaning robot 1m itself.
- the subject may move to the danger zone.
- when the object is an infant, the controller 87 may control a device in the cleaning robot 1m to perform a warning action or a corresponding action for the infant.
- the controller 87 may transmit a warning message to the infant u through the speaker 63.
- the controller 87 may transmit a warning message through the communication unit 52 through a speaker installed in the IoT apparatus.
- the control unit 87 may perform a telephone connection or transmit a message to a user terminal located outside through the communication unit 52. Accordingly, the user can take a quick action by grasping an emergency situation.
- the controller 87 may perform an operation corresponding to a recognition result regarding the voice of the object being tracked in the image information. For example, if it is determined that the object desires to watch a specific program, the controller 87 may turn on the TV through a gateway server of the home network, and control to download the above-mentioned specific program by accessing a communication network.
- the controller 87 may display the above-mentioned program through a device capable of displaying various information, such as the display 42 of the cleaning robot 1m, the display of the IoT apparatus, or the beam projector of the cleaning robot 1m as shown in FIG. 54.
- the controller 87 may support a sound source streaming service through the speaker 63 of the cleaning robot 1m or the speaker of the IoT apparatus, or may support a book reading service through the TTS (Text To Speech) function built in the cleaning robot 1m or the IoT apparatus. That is, the cleaning robot 1m according to the embodiment may not only support a cleaning function but may also provide various multimedia services.
- as shown in FIG. 55, the object may be a visually impaired person who requests, through voice, to be informed of the route to the toilet.
- the controller 87 may combine the map information determined by the SLAM module 53 and the position of the following object through the image unit 74 to provide a path to the toilet through the speaker 63.
- the process according to each situation may be implemented as a program or algorithm and stored in the memory 59 of the cleaning robot 1m or in an external server.
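- Path provision over the map determined by the SLAM module, as described for FIG. 55, can be sketched as a breadth-first search on an occupancy grid; the grid encoding (0 = free, 1 = obstacle) and cell coordinates are assumptions for illustration.

```python
from collections import deque

def shortest_path(grid, start, goal):
    """Breadth-first search over a SLAM-style occupancy grid.
    Cells are (x, y); returns the cell list from start to goal, or None."""
    queue, came_from = deque([start]), {start: None}
    while queue:
        cell = queue.popleft()
        if cell == goal:
            path = []
            while cell is not None:      # walk parents back to the start
                path.append(cell)
                cell = came_from[cell]
            return path[::-1]
        x, y = cell
        for nx, ny in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            if (0 <= ny < len(grid) and 0 <= nx < len(grid[0])
                    and grid[ny][nx] == 0 and (nx, ny) not in came_from):
                came_from[(nx, ny)] = cell
                queue.append((nx, ny))
    return None                          # goal unreachable
```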
- FIG. 56 is a flowchart illustrating an operation of a cleaning robot that acquires image information by following an object according to an exemplary embodiment.
- the cleaning robot may receive information about the object to be followed through the input unit or the communication network from the user (5500). For example, the cleaning robot may display image information through the display of the cleaning robot or through a display of a user terminal located outside connected through a communication network or an IoT apparatus. Accordingly, the user may select an object to be followed through various methods on the display. At this time, the cleaning robot may receive the history information from the user, such as the age or history of the selected object through the input unit or the communication network.
- the cleaning robot may store it so that the user does not need to input specific information when selecting the object again.
- an image of an object may be stored in the cleaning robot in advance, and the user may set a name, a nickname, and the like of the object, and may also set history information on the object.
- the cleaning robot may display the object list through at least one of the aforementioned displays. Then, the user can easily select the object by selecting the object to be followed from the object list without selecting the object in the image information.
- the cleaning robot may acquire image information including the object by following the selected object (5510).
- the cleaning robot may adjust the left / right, up / down, and height of the image unit, for example, the camera, to follow the movement of the object.
- the cleaning robot may transmit image information obtained by following an object to a user terminal through a communication network. Accordingly, the user may receive image information about the object even when located outside. In particular, when the subject is an infant, even if the user is located outside, the accident may be prevented by checking the infant.
- the cleaning robot may acquire image information about the infant by controlling the vehicle equipped with the image unit to track the infant.
- the cleaning robot may receive the image information from the vehicle and transmit it to the user terminal. The user can then check the image information through the user terminal and prepare for any risks. This can also relieve anxiety when the infant is left indoors alone.
- the cleaning robot may detect the movement of the object from the image information and perform a coping operation for a situation arising from that movement (5520). For example, when the cleaning robot determines, based on the object's history information, that the movement may lead to a problem, it may operate according to the corresponding countermeasure.
- the coping method may vary according to the history information of the object. For example, if the object is an infant of N years or younger and the cleaning robot determines, using the indoor map information stored in the memory, that the object is moving toward the kitchen, the cleaning robot may output a warning voice through the speaker, establish a telephone connection to a preset contact, transmit a message, or request external help through the IoT apparatus.
- N may be set to 13 years old or less, but is not limited thereto.
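The rule above combines the object's history information (age at most N) with the indoor map (is the object entering a dangerous zone?). A minimal sketch follows; the threshold, the zone names, and the action names are assumptions for illustration only, not the patent's actual implementation.

```python
# Illustrative countermeasure rule: escalate when an infant of N years
# or younger moves into a zone the indoor map marks as dangerous.

N_YEARS = 13                      # configurable age threshold
DANGER_ZONES = {"kitchen", "window"}

def countermeasures(age, current_zone):
    """Return the escalation actions for the object's current position."""
    if age is not None and age <= N_YEARS and current_zone in DANGER_ZONES:
        return ["warning_voice_via_speaker",
                "call_preset_contact",
                "send_message",
                "request_help_via_iot_device"]
    return []   # no risk detected: keep monitoring

print(countermeasures(age=2, current_zone="kitchen"))
print(countermeasures(age=2, current_zone="living_room"))
```

An adult in the kitchen, or an infant in a safe zone, produces no actions, matching the history-dependent coping described above.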
- the cleaning robot may transmit the image information, together with the infant's voice recorded through the voice recognition module, to a user terminal such as a smartphone or a wearable device.
- the cleaning robot may also notify the outside of the danger through an Internet of Things device, for example by switching a lamp on and off, and the manner of notification is not limited thereto.
- the cleaning robot may also determine from the image information that the movement of the object is not normal, and perform an operation such as an external help request. That is, the cleaning robot may control a device in the cleaning robot according to various countermeasures based on at least one of the history information and the image information, and take appropriate measures with respect to the object.
- FIG. 57 is a flowchart illustrating an operation of a cleaning robot that provides a safety service in anticipation of a risk according to a movement of an object, according to an exemplary embodiment.
- the cleaning robot may determine the possibility of danger according to the movement of the object (5620). If it determines that there is no possibility of danger, the cleaning robot may switch to a standby state in which it can recognize the object's voice (5630). Then, when the object utters a voice command, the cleaning robot may provide a service corresponding to the recognition result. For example, the cleaning robot can provide the various multimedia services described above.
- the cleaning robot may determine that there is a possibility of danger when the object does not appear to be in a normal state, for example when changes in its movement are erratic.
- the cleaning robot may determine that there is a possibility of danger when the infant approaches a dangerous area such as a kitchen or a window as described above.
- the cleaning robot may issue a warning and take countermeasures (5650).
- the cleaning robot may perform various countermeasures, such as outputting a warning message through the speaker, or establishing a telephone connection or transmitting a message to a preset contact.
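The FIG. 57 flow described above, assessing the possibility of danger from the object's movement, then either entering voice-recognition standby or escalating to warnings and countermeasures, can be sketched as a single decision step. The erratic-movement threshold, state names, and action names are illustrative assumptions.

```python
# Minimal sketch of the FIG. 57 decision: erratic movement or proximity
# to a dangerous area => possible danger => warn; otherwise stand by for
# voice commands.

def assess_danger(movement_changes, near_danger_zone, threshold=5):
    """Flag danger when movement changes swing widely or the object
    approaches a dangerous area (kitchen, window)."""
    erratic = (max(movement_changes) - min(movement_changes)) > threshold
    return erratic or near_danger_zone

def step(movement_changes, near_danger_zone):
    if assess_danger(movement_changes, near_danger_zone):
        return ("warn", ["speaker_warning", "call_preset_contact",
                         "send_message"])
    return ("voice_standby", [])   # ready to serve voice commands

print(step([1, 1, 1], near_danger_zone=False))  # steady movement: standby
print(step([1, 1, 9], near_danger_zone=False))  # erratic movement: warn
```

In standby the robot can still serve the multimedia requests mentioned earlier; only a positive danger assessment switches it into the warning branch.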
- terms including ordinal numbers such as “first” and “second” as used herein may be used to describe various components, but the components are not limited by these terms, which are used only to distinguish one component from another.
- first component may be referred to as the second component, and similarly, the second component may also be referred to as the first component.
- the term “and / or” includes any combination of a plurality of related items or any of a plurality of related items.
- terms such as “~unit”, “~group”, “~block”, “~member”, and “~module” used throughout the present specification may mean a unit for processing at least one function or operation. For example, they may mean software, or hardware such as an FPGA or an ASIC.
- however, the terms “~part”, “~group”, “~block”, “~member”, and “~module” are not limited to software or hardware; they may be components stored in an accessible storage medium and executed by one or more processors.
Landscapes
- Engineering & Computer Science (AREA)
- Mechanical Engineering (AREA)
- Robotics (AREA)
- Automation & Control Theory (AREA)
- Radar, Positioning & Navigation (AREA)
- Human Computer Interaction (AREA)
- Aviation & Aerospace Engineering (AREA)
- Remote Sensing (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Electric Vacuum Cleaner (AREA)
- Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
- Manipulator (AREA)
Abstract
Description
Claims (15)
- A cleaning robot comprising: a modular unit in which at least one module supporting different functions is integrated; and a controller configured to control an operation of the modular unit so as to control at least one of a device in the cleaning robot and an Internet of Things (IoT) device.
- The cleaning robot of claim 1, wherein the modular unit comprises a gripper module configured to control a robot arm to mount a cleaning head on the robot arm, and the controller controls an operation of the gripper module so as to control performance of cleaning using the cleaning head mounted on the robot arm.
- The cleaning robot of claim 1, wherein the modular unit comprises a communication module supporting connection with at least one other cleaning robot through a communication network, and the controller controls an operation of the communication module, based on at least one of the specifications supported by the at least one other cleaning robot and the size and shape of the cleaning area, so as to control performance of joint cleaning.
- The cleaning robot of claim 1, wherein the modular unit comprises a recognition module configured to detect a user present indoors, and the controller controls performance of cleaning based on at least one of a detection result obtained through the recognition module and a detection result obtained through an IoT device.
- The cleaning robot of claim 1, wherein the modular unit comprises a sensor module configured to acquire indoor environmental information, and the controller controls at least one of the cleaning robot and an IoT device to adjust the indoor environment, based on the indoor environmental information acquired through the sensor module and desired indoor environment information set by a user.
- The cleaning robot of claim 1, wherein the modular unit comprises a communication module supporting connection with an external device through a communication network and an imaging unit configured to acquire image information, and the controller, upon receiving a remote access request through the communication module, controls the imaging unit to acquire image information.
- A cleaning robot comprising: a robot arm on which a cleaning head is mounted; and a controller configured to determine, among a plurality of cleaning heads, a cleaning head corresponding to a cleaning mode, control mounting between the robot arm and the determined cleaning head, and control performance of cleaning using the mounted cleaning head.
- The cleaning robot of claim 1, wherein a water supply pipe for supplying water to the cleaning head and a suction flow path for suctioning dust are provided inside the robot arm.
- The cleaning robot of claim 1, wherein a docking unit for inducing coupling with the cleaning head and an electromagnet for fixing the cleaning head are provided inside the robot arm.
- The cleaning robot of claim 1, further comprising a cleaning head storage box in which the plurality of cleaning heads are stored.
- The cleaning robot of claim 10, wherein the cleaning head storage box is provided at a station of the cleaning robot or at a preset position.
- The cleaning robot of claim 10, wherein the plurality of cleaning heads are stored at preset positions in the cleaning head storage box.
- The cleaning robot of claim 10, wherein the controller identifies the position of the cleaning head storage box using an infrared (IR) sensor mounted on the cleaning head storage box.
- A cleaning robot comprising: a communication module configured to perform a communication connection with at least one other cleaning robot; a determination unit configured to determine a joint cleaning method with the at least one other cleaning robot based on at least one of the specifications supported by the at least one other cleaning robot and the size and shape of the cleaning area; and a controller configured to control the cleaning robot and the at least one other cleaning robot based on the determined joint cleaning method.
- The cleaning robot of claim 14, wherein the determination unit determines one of a swarm joint cleaning method in which the same area is cleaned together with the at least one other cleaning robot, a zone-based joint cleaning method in which the cleaning area is divided into zones cleaned separately with the at least one other cleaning robot, and a multi-pattern joint cleaning method in which the swarm joint cleaning method and the zone-based joint cleaning method are combined.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201680070436.5A CN108366707B (zh) | 2015-10-27 | 2016-10-20 | 清洁机器人及其控制方法 |
AU2016346447A AU2016346447B2 (en) | 2015-10-27 | 2016-10-20 | Cleaning robot and method for controlling same |
US15/771,350 US11019972B2 (en) | 2015-10-27 | 2016-10-20 | Cleaning robot and method for controlling same |
EP16860138.3A EP3342324B1 (en) | 2015-10-27 | 2016-10-20 | Cleaning robot with a controller for controlling a quiet mode |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020150149359A KR102521493B1 (ko) | 2015-10-27 | 2015-10-27 | 청소 로봇 및 그 제어방법 |
KR10-2015-0149359 | 2015-10-27 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2017073955A1 true WO2017073955A1 (ko) | 2017-05-04 |
Family
ID=58630898
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/KR2016/011823 WO2017073955A1 (ko) | 2015-10-27 | 2016-10-20 | 청소 로봇 및 그 제어방법 |
Country Status (6)
Country | Link |
---|---|
US (1) | US11019972B2 (ko) |
EP (1) | EP3342324B1 (ko) |
KR (1) | KR102521493B1 (ko) |
CN (1) | CN108366707B (ko) |
AU (1) | AU2016346447B2 (ko) |
WO (1) | WO2017073955A1 (ko) |
Cited By (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN108968805A (zh) * | 2017-06-05 | 2018-12-11 | 碧洁家庭护理有限公司 | 自主地板清洁*** |
EP3415070A1 (de) * | 2017-06-16 | 2018-12-19 | Vorwerk & Co. Interholding GmbH | System mit mindestens zwei bodenbearbeitungseinrichtungen |
EP3416019A1 (de) * | 2017-06-16 | 2018-12-19 | Vorwerk & Co. Interholding GmbH | System mit mindestens zwei sich selbsttätig fortbewegenden bodenbearbeitungsgeräten |
CN109079772A (zh) * | 2017-06-14 | 2018-12-25 | 深圳乐动机器人有限公司 | 机器人及机器人*** |
CN110780779A (zh) * | 2019-09-25 | 2020-02-11 | 北京爱接力科技发展有限公司 | 一种机器人服务方法、装置和机器人终端 |
US20200152073A1 (en) * | 2018-11-08 | 2020-05-14 | Hyundai Motor Company | Service robot and method for operating thereof |
WO2021068699A1 (zh) * | 2019-10-08 | 2021-04-15 | 炬星科技(深圳)有限公司 | 机器人控制方法、电子设备及计算机可读存储介质 |
EP3690597A4 (en) * | 2017-09-25 | 2021-06-09 | Positec Power Tools (Suzhou) Co., Ltd | POWER TOOL SYSTEM, CHARGER, POWER TOOL AND ASSOCIATED VOICE COMMAND PROCESS, AND AUTOMATIC WORK SYSTEM, CHARGING STATION, AUTOMATIC MOVING DEVICE AND ASSOCIATED VOICE COMMAND PROCESS |
EP3727122A4 (en) * | 2017-12-18 | 2021-08-11 | LG Electronics Inc. | ROBOT CLEANERS AND THEIR CONTROL PROCESS |
US11308788B2 (en) | 2019-02-06 | 2022-04-19 | Ecolab Usa Inc. | Hygiene management for reducing illnesses and infections caused by ineffective hygiene practices |
EP3508938B1 (en) * | 2018-01-05 | 2022-09-28 | iRobot Corporation | Mobile cleaning robot teaming and persistent mapping |
US11656082B1 (en) * | 2017-10-17 | 2023-05-23 | AI Incorporated | Method for constructing a map while performing work |
Families Citing this family (101)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10866783B2 (en) * | 2011-08-21 | 2020-12-15 | Transenterix Europe S.A.R.L. | Vocally activated surgical control system |
US11561762B2 (en) * | 2011-08-21 | 2023-01-24 | Asensus Surgical Europe S.A.R.L. | Vocally actuated surgical control system |
US20170364073A1 (en) * | 2016-06-21 | 2017-12-21 | Keith Alan Guy | Modular Robotic System |
JP6680660B2 (ja) * | 2016-10-18 | 2020-04-15 | ファナック株式会社 | ワイヤ放電加工機 |
KR101952414B1 (ko) * | 2016-10-25 | 2019-02-26 | 엘지전자 주식회사 | 청소기 및 그 제어방법 |
DE102017207341A1 (de) * | 2017-05-02 | 2018-11-08 | Henkel Ag & Co. Kgaa | Verfahren zur Steuerung von Reinigungsgeräten |
KR102412847B1 (ko) * | 2017-06-30 | 2022-06-24 | 엘지전자 주식회사 | 로봇 청소기 및 로봇 청소 시스템 |
KR102361315B1 (ko) * | 2017-09-22 | 2022-02-09 | 엘지전자 주식회사 | 이동 로봇 및 이동 로봇의 제어방법 |
KR102219801B1 (ko) * | 2017-09-22 | 2021-02-23 | 엘지전자 주식회사 | 인공지능을 이용한 이동 로봇 및 이동 로봇의 제어방법 |
KR101992609B1 (ko) * | 2017-10-17 | 2019-06-25 | 이현수 | 다기능 청소기 |
US10612929B2 (en) * | 2017-10-17 | 2020-04-07 | AI Incorporated | Discovering and plotting the boundary of an enclosure |
JP7093170B2 (ja) * | 2017-10-20 | 2022-06-29 | 東芝ライフスタイル株式会社 | 衣類処理装置 |
EP3479667A1 (en) * | 2017-11-02 | 2019-05-08 | Melos GmbH | Artificial turf maintenance robot |
CN108344414A (zh) * | 2017-12-29 | 2018-07-31 | 中兴通讯股份有限公司 | 一种地图构建、导航方法及装置、*** |
KR102489806B1 (ko) * | 2018-01-03 | 2023-01-19 | 삼성전자주식회사 | 청소용 이동장치, 협업청소 시스템 및 그 제어방법 |
KR102385263B1 (ko) | 2018-01-04 | 2022-04-12 | 삼성전자주식회사 | 이동형 홈 로봇 및 이동형 홈 로봇의 제어 방법 |
KR102015030B1 (ko) * | 2018-01-09 | 2019-08-27 | 엘지전자 주식회사 | 이동 로봇 및 이동 로봇의 제어방법 |
KR20190088824A (ko) * | 2018-01-19 | 2019-07-29 | 삼성전자주식회사 | 로봇 청소기 및 그 제어 방법 |
US20190246858A1 (en) * | 2018-02-13 | 2019-08-15 | Nir Karasikov | Cleaning robot with arm and tool receptacles |
KR102077669B1 (ko) * | 2018-03-06 | 2020-02-14 | 네이버랩스 주식회사 | 사용자의 터치 인터랙션과 연관된 센싱 데이터를 처리하는 방법 및 장치 |
KR102152731B1 (ko) * | 2018-03-16 | 2020-09-07 | 엘지전자 주식회사 | 이동 로봇 및 그 제어방법 |
WO2019212240A1 (en) * | 2018-05-04 | 2019-11-07 | Lg Electronics Inc. | A plurality of robot cleaner and a controlling method for the same |
KR102100476B1 (ko) | 2018-05-04 | 2020-05-26 | 엘지전자 주식회사 | 복수의 이동 로봇 및 그 제어방법 |
JP7281707B2 (ja) * | 2018-07-06 | 2023-05-26 | パナソニックIpマネジメント株式会社 | 移動ロボット、及び、制御方法 |
KR102612827B1 (ko) * | 2018-07-23 | 2023-12-11 | 엘지전자 주식회사 | 인공지능 이동 로봇의 제어 방법 |
KR102612822B1 (ko) * | 2018-07-23 | 2023-12-11 | 엘지전자 주식회사 | 인공지능 이동 로봇의 제어 방법 |
JP2021177265A (ja) * | 2018-08-01 | 2021-11-11 | ソニーグループ株式会社 | 移動体及び制御方法 |
JP6823018B2 (ja) * | 2018-08-03 | 2021-01-27 | ファナック株式会社 | 協調動作支援装置 |
WO2020028981A1 (en) * | 2018-08-07 | 2020-02-13 | Anram Holdings | Remote cleaning quality management systems and related methods of use |
US11303707B1 (en) * | 2018-08-14 | 2022-04-12 | Joelle Adler | Internet of things sanitization system and method of operation through a blockchain network |
DE102018120641A1 (de) * | 2018-08-23 | 2020-02-27 | Vorwerk & Co. Interholding Gmbh | System, Saugrohr und Verfahren zum Entriegeln eines Anschlusses zwischen einem Staubsaugergrundkörper und einer Saugdüse oder zum Entriegeln eines Anschlusses zwischen einem Saugrohr und einer Saugdüse |
US11259676B2 (en) * | 2018-08-30 | 2022-03-01 | Sensirion Ag | Vacuum cleaner device |
JP6961557B2 (ja) * | 2018-09-18 | 2021-11-05 | 株式会社東芝 | 物品姿勢変更装置および物品姿勢変更方法 |
US20200100639A1 (en) * | 2018-10-01 | 2020-04-02 | International Business Machines Corporation | Robotic vacuum cleaners |
US20210339401A1 (en) * | 2018-10-03 | 2021-11-04 | Sony Group Corporation | Mobile unit control device, mobile unit control method, and program |
JP7225659B2 (ja) * | 2018-10-11 | 2023-02-21 | ソニーグループ株式会社 | 情報処理装置、情報処理方法および情報処理プログラム |
KR200491772Y1 (ko) * | 2018-11-02 | 2020-06-02 | 이유석 | 인공지능 스피커 및/또는 적외선 송신 모듈이 장착 가능한 로봇 청소기 |
KR102168795B1 (ko) * | 2018-11-16 | 2020-10-22 | 메타로보틱스 주식회사 | 가축전염병 예방을 위한 무인주행 소독 로봇시스템 |
USD938115S1 (en) | 2018-11-30 | 2021-12-07 | Irobot Corporation | Autonomous floor cleaning robot |
CN115429159A (zh) * | 2018-12-05 | 2022-12-06 | 科沃斯机器人股份有限公司 | 机器人作业方法、机器人和存储介质 |
CN111319046B (zh) * | 2018-12-13 | 2023-07-07 | 深圳小牛黑科技有限公司 | 一种机器人香薰及控制方法 |
KR102148010B1 (ko) * | 2018-12-20 | 2020-08-26 | 동의대학교 산학협력단 | 클라우드 서버 기반의 무인이동체 IoT 서비스를 위한 장치 및 방법 |
KR102368856B1 (ko) * | 2018-12-31 | 2022-03-02 | 주식회사 도구공간 | 오브젝트 탐지 방법 및 장치 |
KR102234641B1 (ko) * | 2019-01-17 | 2021-03-31 | 엘지전자 주식회사 | 이동 로봇 및 복수의 이동 로봇의 제어방법 |
CN109700385A (zh) * | 2019-01-28 | 2019-05-03 | 珠海格力电器股份有限公司 | 基于微波雷达的清扫设备的控制方法以及装置 |
KR102198439B1 (ko) * | 2019-01-29 | 2021-01-05 | 성창경 | 코딩 로봇 작동 제어 유닛 및 상기 코딩 로봇 작동 제어 유닛의 제어 방법 |
JP2020121375A (ja) * | 2019-01-30 | 2020-08-13 | 株式会社Preferred Networks | 制御装置、制御対象装置、制御方法及びプログラム |
KR102267690B1 (ko) * | 2019-02-20 | 2021-06-22 | 엘지전자 주식회사 | 복수의 자율주행 이동 로봇 |
KR20200115696A (ko) | 2019-03-07 | 2020-10-08 | 삼성전자주식회사 | 전자 장치 및 그 제어 방법 |
DE102019203797B4 (de) * | 2019-03-20 | 2021-01-21 | Edag Engineering Gmbh | Antriebsvorrichtung für ein Verkehrsmittelwechselsystem und Verkehrsmittelwechselsystem |
KR102217540B1 (ko) * | 2019-05-07 | 2021-02-19 | 엘지전자 주식회사 | 이동 로봇 및 복수의 이동 로봇의 제어방법 |
WO2020241933A1 (ko) | 2019-05-30 | 2020-12-03 | 엘지전자 주식회사 | 슬레이브 로봇을 제어하는 마스터 로봇 및 그의 구동 방법 |
JP7221817B2 (ja) * | 2019-07-01 | 2023-02-14 | 東芝ライフスタイル株式会社 | 自律型掃除機 |
KR102317723B1 (ko) * | 2019-07-04 | 2021-10-25 | 엘지전자 주식회사 | 이동로봇 및 그의 제어방법 |
KR102306437B1 (ko) * | 2019-07-05 | 2021-09-28 | 엘지전자 주식회사 | 이동 로봇 및 그 제어방법 |
CN110464263A (zh) * | 2019-08-21 | 2019-11-19 | 深圳乐动机器人有限公司 | 一种控制机器人清洁的方法及机器人 |
JP6947205B2 (ja) * | 2019-08-26 | 2021-10-13 | ダイキン工業株式会社 | 空気調和システム、および、空気調和システムを用いた情報提供方法 |
US11480431B1 (en) * | 2019-08-27 | 2022-10-25 | Alarm.Com Incorporated | Lighting adaptive navigation |
JP7191794B2 (ja) * | 2019-09-04 | 2022-12-19 | 本田技研工業株式会社 | 自律除雪機 |
WO2021044953A1 (ja) * | 2019-09-06 | 2021-03-11 | ソニー株式会社 | 情報処理システム及び情報処理方法 |
CN210836195U (zh) * | 2019-09-20 | 2020-06-23 | 宁波八益集团有限公司 | 档案库房机器人管理*** |
KR102330901B1 (ko) * | 2019-09-27 | 2021-11-26 | 엘지전자 주식회사 | 이동로봇 |
KR102295824B1 (ko) | 2019-12-06 | 2021-08-31 | 엘지전자 주식회사 | 잔디깎기 로봇의 지도 생성방법 |
CN112971611A (zh) * | 2019-12-12 | 2021-06-18 | 苏州宝时得电动工具有限公司 | 用于收纳地面清理机器人的基站和地面清理机器人*** |
KR20210099470A (ko) * | 2020-02-04 | 2021-08-12 | 엘지전자 주식회사 | 청소기 |
KR20210099787A (ko) * | 2020-02-05 | 2021-08-13 | 삼성전자주식회사 | 안전 서비스를 제공하는 전자 장치 및 그 방법 |
US11561102B1 (en) | 2020-04-17 | 2023-01-24 | AI Incorporated | Discovering and plotting the boundary of an enclosure |
KR102421519B1 (ko) * | 2020-05-26 | 2022-07-15 | 엘지전자 주식회사 | 이동 로봇 시스템 및 이동 로봇 시스템의 경계 정보 생성 방법 |
CN111672192B (zh) * | 2020-05-28 | 2021-11-02 | 佛山市百斯特电器科技有限公司 | 洗涤设备的清洁提醒方法及装置 |
AU2021301912A1 (en) * | 2020-07-01 | 2023-02-09 | Lg Electronics Inc. | Robot cleaner, robot cleaner system including same, and method for controlling robot cleaner system |
KR20220003250A (ko) * | 2020-07-01 | 2022-01-10 | 엘지전자 주식회사 | 로봇 청소기와 이를 구비하는 로봇 청소기 시스템 |
AU2021299590A1 (en) * | 2020-07-01 | 2023-02-09 | Lg Electronics Inc. | Robot cleaner, system for controlling robot cleaner, and method for controlling robot cleaner |
ES2893051A1 (es) * | 2020-07-28 | 2022-02-07 | Cecotec Res And Development | Estacion de carga para robot aspirador |
GB2604412A (en) * | 2020-08-19 | 2022-09-07 | Zhejiang Ubp New Energy Tech Co Ltd | Cleaning system and cleaning robot |
CN111973087B (zh) * | 2020-08-19 | 2021-11-30 | 浙江明鹏新能源科技有限公司 | 清扫***及清洁机器人 |
GB2600734B (en) | 2020-11-06 | 2022-12-07 | Dyson Technology Ltd | Robotic surface treating system |
GB2600735B (en) | 2020-11-06 | 2023-07-19 | Dyson Technology Ltd | Robotic surface treating system |
GB2600733B (en) | 2020-11-06 | 2023-04-19 | Dyson Technology Ltd | Robotic vacuum cleaning system |
GB2600732B (en) | 2020-11-06 | 2023-04-19 | Dyson Technology Ltd | Robotic vacuum cleaning system |
GB2600731B (en) | 2020-11-06 | 2023-04-19 | Dyson Technology Ltd | Robotic vacuum cleaning system |
GB2600736B (en) | 2020-11-06 | 2023-04-19 | Dyson Technology Ltd | Robotic surface treating system |
GB2600739B (en) | 2020-11-06 | 2022-12-07 | Dyson Technology Ltd | Robotic surface treating system |
GB2600737B (en) | 2020-11-06 | 2023-07-12 | Dyson Technology Ltd | Robotic surface treating system |
GB2600729B (en) | 2020-11-06 | 2023-07-12 | Dyson Technology Ltd | Robotic vacuum cleaning system |
GB2600738B (en) | 2020-11-06 | 2023-07-12 | Dyson Technology Ltd | Robotic surface treating system |
GB2600730B (en) | 2020-11-06 | 2023-04-19 | Dyson Technology Ltd | Robotic vacuum cleaning system |
US20220142422A1 (en) * | 2020-11-06 | 2022-05-12 | Mark Jeffery Giarritta | Automatic multi-attachment changing station |
KR102478747B1 (ko) * | 2020-11-20 | 2022-12-20 | 한국광기술원 | 드론 연동 방식 3차원 공간 청소 로봇 및 그 구동 방법 |
JP7452409B2 (ja) * | 2020-12-25 | 2024-03-19 | トヨタ自動車株式会社 | タスクシステム、制御方法及び制御プログラム |
DE102021102655A1 (de) * | 2021-02-04 | 2022-08-04 | Vorwerk & Co. Interholding Gesellschaft mit beschränkter Haftung | System zum Reinigen einer Umgebung |
CN113459123B (zh) * | 2021-07-14 | 2022-11-18 | 浙江同济科技职业学院 | 一种便于移动控制的家居看护机器人 |
CN113633224A (zh) * | 2021-08-06 | 2021-11-12 | 珠海一微半导体股份有限公司 | 可重构的清洁机器人***及控制方法 |
CN113757976B (zh) * | 2021-09-14 | 2022-11-18 | 重庆海尔空调器有限公司 | 空调物联控制方法、控制***、电子设备和存储介质 |
CN114052566B (zh) * | 2021-11-04 | 2023-09-26 | 青岛海尔空调器有限总公司 | 用于智能移动设备的控制方法及装置、***、移动设备 |
CN113995343B (zh) * | 2021-11-15 | 2022-10-14 | 上海景吾智能科技有限公司 | 清洁类机器人电动夹爪结构 |
CN114296442B (zh) * | 2021-11-15 | 2023-08-11 | 珠海格力电器股份有限公司 | 一种床被除螨机器人的控制方法 |
KR20230073598A (ko) | 2021-11-19 | 2023-05-26 | 주식회사 아이테크 | 열풍 건조식 물기제거 청소기 |
WO2023110103A1 (en) * | 2021-12-16 | 2023-06-22 | Aktiebolaget Electrolux | Robotic cleaning device with controllable arm |
WO2023125698A1 (zh) * | 2021-12-28 | 2023-07-06 | 美智纵横科技有限责任公司 | 清洁设备及其控制方法和控制装置 |
CN115486763A (zh) * | 2022-08-26 | 2022-12-20 | 珠海格力电器股份有限公司 | 一种扫地机器人路线规划方法、装置、扫地机器人及*** |
KR20240052418A (ko) | 2022-10-14 | 2024-04-23 | 엘지전자 주식회사 | 로봇 |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR20070096272A (ko) * | 2006-03-23 | 2007-10-02 | 김재윤 | 실내 환경개선 및 자동제어 로봇시스템 |
US20100256812A1 (en) * | 2008-08-08 | 2010-10-07 | Yuko Tsusaka | Control device and control method for cleaner, cleaner, control program for cleaner, and integrated electronic circuit |
KR101349671B1 (ko) * | 2012-08-24 | 2014-01-16 | 전자부품연구원 | 작업 영역 판별을 이용한 건물 외벽에 대한 관리 작업 수행 장치 및 방법 |
KR20150026528A (ko) * | 2013-09-03 | 2015-03-11 | 엘지전자 주식회사 | 로봇 청소기, 이동 단말기 및 그 동작방법 |
KR20150075639A (ko) * | 2013-12-26 | 2015-07-06 | 주식회사 라스테크 | 협동로봇 제어시스템 |
Family Cites Families (17)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2005211498A (ja) * | 2004-01-30 | 2005-08-11 | Funai Electric Co Ltd | 自走式掃除機 |
KR100664053B1 (ko) * | 2004-09-23 | 2007-01-03 | 엘지전자 주식회사 | 로봇청소기의 청소툴 자동 교환 시스템 및 방법 |
EP2027806A1 (en) * | 2006-04-04 | 2009-02-25 | Samsung Electronics Co., Ltd. | Robot cleaner system having robot cleaner and docking station |
KR20070101002A (ko) * | 2006-04-10 | 2007-10-16 | 이바도 | 위성 방식의 청소로봇 시스템 |
CN202053254U (zh) * | 2010-11-25 | 2011-11-30 | 上海领派机电科技有限公司 | 一种服务型机械手 |
CN202397386U (zh) | 2011-08-26 | 2012-08-29 | 上海电机学院 | 智能吸盘式擦洗机 |
JP5968627B2 (ja) * | 2012-01-17 | 2016-08-10 | シャープ株式会社 | 掃除機、制御プログラム、および該制御プログラムを記録したコンピュータ読み取り可能な記録媒体 |
JP2015535373A (ja) | 2012-10-05 | 2015-12-10 | アイロボット コーポレイション | 移動ロボットを含むドッキングステーション姿勢を決定するためのロボット管理システムとこれを用いる方法 |
CN203074576U (zh) * | 2012-12-19 | 2013-07-24 | 南昌航空大学 | 多功能清洁机器人 |
CN109965778B (zh) | 2013-01-18 | 2022-08-16 | 艾罗伯特公司 | 包括移动机器人的环境管理***以及其使用方法 |
CN104460663A (zh) * | 2013-09-23 | 2015-03-25 | 科沃斯机器人科技(苏州)有限公司 | 智能手机控制清扫机器人的方法 |
CN104714411A (zh) * | 2013-12-13 | 2015-06-17 | 青岛海尔机器人有限公司 | 一种智能家居机器人 |
CN204685531U (zh) * | 2015-04-09 | 2015-10-07 | 徐州德坤电气科技有限公司 | 智能自动清洁单元清洁机械手 |
KR101654012B1 (ko) * | 2015-05-20 | 2016-09-09 | 주식회사 파인로보틱스 | 걸레 로봇 청소기 |
JP2016220174A (ja) * | 2015-05-26 | 2016-12-22 | 株式会社東芝 | 家電制御方法及び家電制御装置 |
KR101706966B1 (ko) * | 2015-07-17 | 2017-02-15 | 엘지전자 주식회사 | 로봇 청소기 |
EP3695766A4 (en) * | 2017-10-13 | 2021-05-05 | Chiba Institute of Technology | SELF-PROPELLED VACUUM CLEANER |
2015
- 2015-10-27 KR KR1020150149359A patent/KR102521493B1/ko active IP Right Grant
2016
- 2016-10-20 US US15/771,350 patent/US11019972B2/en active Active
- 2016-10-20 WO PCT/KR2016/011823 patent/WO2017073955A1/ko active Application Filing
- 2016-10-20 EP EP16860138.3A patent/EP3342324B1/en active Active
- 2016-10-20 AU AU2016346447A patent/AU2016346447B2/en active Active
- 2016-10-20 CN CN201680070436.5A patent/CN108366707B/zh active Active
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR20070096272A (ko) * | 2006-03-23 | 2007-10-02 | 김재윤 | 실내 환경개선 및 자동제어 로봇시스템 |
US20100256812A1 (en) * | 2008-08-08 | 2010-10-07 | Yuko Tsusaka | Control device and control method for cleaner, cleaner, control program for cleaner, and integrated electronic circuit |
KR101349671B1 (ko) * | 2012-08-24 | 2014-01-16 | 전자부품연구원 | 작업 영역 판별을 이용한 건물 외벽에 대한 관리 작업 수행 장치 및 방법 |
KR20150026528A (ko) * | 2013-09-03 | 2015-03-11 | 엘지전자 주식회사 | 로봇 청소기, 이동 단말기 및 그 동작방법 |
KR20150075639A (ko) * | 2013-12-26 | 2015-07-06 | 주식회사 라스테크 | 협동로봇 제어시스템 |
Non-Patent Citations (1)
Title |
---|
See also references of EP3342324A4 * |
Cited By (27)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
AU2018203588B2 (en) * | 2017-06-05 | 2019-11-14 | Bissell Inc. | Autonomous floor cleaning system |
CN108968805B (zh) * | 2017-06-05 | 2021-01-05 | 碧洁家庭护理有限公司 | 自主地板清洁*** |
AU2019257434B2 (en) * | 2017-06-05 | 2020-10-15 | Bissell Homecare, Inc. | Autonomous floor cleaning system |
EP3653098A1 (en) * | 2017-06-05 | 2020-05-20 | Bissell Homecare, Inc. | Autonomous floor cleaning system |
US10602898B2 (en) | 2017-06-05 | 2020-03-31 | Bissell Homecare, Inc. | Autonomous floor cleaning system |
CN108968805A (zh) * | 2017-06-05 | 2018-12-11 | 碧洁家庭护理有限公司 | 自主地板清洁*** |
EP3437537A1 (en) * | 2017-06-05 | 2019-02-06 | Bissell Homecare, Inc. | Autonomous floor cleaning system |
CN109079772A (zh) * | 2017-06-14 | 2018-12-25 | 深圳乐动机器人有限公司 | 机器人及机器人*** |
CN109124458A (zh) * | 2017-06-16 | 2019-01-04 | 德国福维克控股公司 | 具有至少两个自主行进的地面处理设备的*** |
CN109124488B (zh) * | 2017-06-16 | 2022-04-12 | 德国福维克控股公司 | 具有至少两个地面处理装置的***和运行该***的方法 |
CN109124488A (zh) * | 2017-06-16 | 2019-01-04 | 德国福维克控股公司 | 具有至少两个地面处理装置的*** |
DE102017113288A1 (de) * | 2017-06-16 | 2018-12-20 | Vorwerk & Co. Interholding Gmbh | System mit mindestens zwei Bodenbearbeitungseinrichtungen |
EP3416019A1 (de) * | 2017-06-16 | 2018-12-19 | Vorwerk & Co. Interholding GmbH | System mit mindestens zwei sich selbsttätig fortbewegenden bodenbearbeitungsgeräten |
EP3415070A1 (de) * | 2017-06-16 | 2018-12-19 | Vorwerk & Co. Interholding GmbH | System mit mindestens zwei bodenbearbeitungseinrichtungen |
US11380320B2 (en) | 2017-09-25 | 2022-07-05 | Positec Power Tools (Suzhou) Co., Ltd. | Electric tool system, charger, electric tool, and voice control method thereof, automatic working system, charging station, self-moving device, and voice control method thereof |
EP3690597A4 (en) * | 2017-09-25 | 2021-06-09 | Positec Power Tools (Suzhou) Co., Ltd | POWER TOOL SYSTEM, CHARGER, POWER TOOL AND ASSOCIATED VOICE COMMAND PROCESS, AND AUTOMATIC WORK SYSTEM, CHARGING STATION, AUTOMATIC MOVING DEVICE AND ASSOCIATED VOICE COMMAND PROCESS |
US11656082B1 (en) * | 2017-10-17 | 2023-05-23 | AI Incorporated | Method for constructing a map while performing work |
EP3727122A4 (en) * | 2017-12-18 | 2021-08-11 | LG Electronics Inc. | ROBOT CLEANERS AND THEIR CONTROL PROCESS |
EP3508938B1 (en) * | 2018-01-05 | 2022-09-28 | iRobot Corporation | Mobile cleaning robot teaming and persistent mapping |
US11614746B2 (en) | 2018-01-05 | 2023-03-28 | Irobot Corporation | Mobile cleaning robot teaming and persistent mapping |
US11538350B2 (en) * | 2018-11-08 | 2022-12-27 | Hyundai Motor Company | Service robot and method for operating thereof |
US20200152073A1 (en) * | 2018-11-08 | 2020-05-14 | Hyundai Motor Company | Service robot and method for operating thereof |
US11308788B2 (en) | 2019-02-06 | 2022-04-19 | Ecolab Usa Inc. | Hygiene management for reducing illnesses and infections caused by ineffective hygiene practices |
US11430321B2 (en) | 2019-02-06 | 2022-08-30 | Ecolab Usa Inc. | Reducing illnesses and infections caused by ineffective cleaning by tracking and controlling cleaning efficacy |
US11804124B2 (en) | 2019-02-06 | 2023-10-31 | Ecolab Usa Inc. | Reducing illnesses and infections caused by ineffective cleaning by tracking and controlling cleaning efficacy |
CN110780779A (zh) * | 2019-09-25 | 2020-02-11 | 北京爱接力科技发展有限公司 | 一种机器人服务方法、装置和机器人终端 |
WO2021068699A1 (zh) * | 2019-10-08 | 2021-04-15 | 炬星科技(深圳)有限公司 | 机器人控制方法、电子设备及计算机可读存储介质 |
Also Published As
Publication number | Publication date |
---|---|
KR20170048815A (ko) | 2017-05-10 |
EP3342324C0 (en) | 2024-01-17 |
US20180317725A1 (en) | 2018-11-08 |
AU2016346447B2 (en) | 2021-12-16 |
KR102521493B1 (ko) | 2023-04-14 |
EP3342324B1 (en) | 2024-01-17 |
EP3342324A4 (en) | 2019-02-13 |
AU2016346447A1 (en) | 2018-04-26 |
CN108366707A (zh) | 2018-08-03 |
US11019972B2 (en) | 2021-06-01 |
CN108366707B (zh) | 2022-03-08 |
EP3342324A1 (en) | 2018-07-04 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2017073955A1 (ko) | 청소 로봇 및 그 제어방법 | |
WO2020222340A1 (ko) | 인공지능 로봇과 그의 제어 방법 | |
AU2019336870B2 (en) | Plurality of autonomous mobile robots and controlling method for the same | |
WO2018101776A1 (en) | Apparatus and method for controlling light | |
WO2016122143A1 (en) | Method and apparatus for improving and monitoring sleep | |
WO2020218652A1 (ko) | 공기 청정기 | |
WO2017200353A1 (ko) | 로봇 청소기 | |
WO2015178562A1 (en) | Method and apparatus for providing notification | |
WO2014175605A1 (en) | Cleaning robot, home monitoring apparatus, and method for controlling the cleaning robot | |
EP3250110A1 (en) | Method and apparatus for improving and monitoring sleep | |
WO2020050494A1 (en) | A robot cleaner and a controlling method for the same | |
WO2016111556A1 (en) | Method of wirelessly connecting devices, and device thereof | |
WO2017010760A1 (en) | Hub apparatus and method for providing service thereof | |
WO2021261747A1 (ko) | 슈즈 케어 장치 및 방법 | |
WO2019151845A2 (ko) | 에어컨 | |
WO2016089083A1 (en) | Attachment device and method for controlling electronic device thereof | |
WO2019050227A1 (ko) | 공기조화기의 동작 방법 | |
WO2021261745A1 (ko) | 슈즈 케어 장치 및 방법 | |
WO2018048098A1 (en) | Portable camera and controlling method therefor | |
WO2020004895A1 (ko) | 로봇 | |
WO2022075758A1 (ko) | 전자 장치 및 이의 제어 방법 | |
WO2019004773A1 (ko) | 이동 단말기 및 이를 포함하는 로봇 시스템 | |
WO2023101228A1 (ko) | 로봇 친화형 건물, 복수의 로봇들을 이용한 협업 방법 및 시스템 | |
EP3320655A1 (en) | Hub apparatus and method for providing service thereof | |
WO2021261749A1 (ko) | 슈즈 케어 장치 및 방법 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 16860138 Country of ref document: EP Kind code of ref document: A1 |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2016860138 Country of ref document: EP |
|
ENP | Entry into the national phase |
Ref document number: 2016346447 Country of ref document: AU Date of ref document: 20161020 Kind code of ref document: A |
|
WWE | Wipo information: entry into national phase |
Ref document number: 15771350 Country of ref document: US |
|
NENP | Non-entry into the national phase |
Ref country code: DE |