CN113171472B - Disinfection robot - Google Patents

Disinfection robot

Info

Publication number
CN113171472B
CN113171472B
Authority
CN
China
Prior art keywords
sterilization
robot
user
disinfection
sterilization robot
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202010454119.1A
Other languages
Chinese (zh)
Other versions
CN113171472A (en)
Inventor
吴骁伟
郭成凯
孙广石
Current Assignee
Zhongke Wangfu Beijing Technology Co ltd
Original Assignee
Zhongke Wangfu Beijing Technology Co ltd
Priority date
Application filed by Zhongke Wangfu Beijing Technology Co ltd
Priority to CN202010454119.1A
Publication of CN113171472A
Application granted
Publication of CN113171472B
Legal status: Active

Classifications

    • A61L 2/10: Ultraviolet radiation (disinfection or sterilisation of materials or objects)
    • A61L 2/24: Apparatus using programmed or automatic operation
    • A61L 9/20: Ultraviolet radiation (disinfection, sterilisation or deodorisation of air)
    • B25J 5/007: Manipulators mounted on wheels
    • B25J 11/00: Manipulators not otherwise provided for
    • B25J 19/00: Accessories fitted to manipulators, e.g. for monitoring or viewing; safety devices for use with manipulators
    • G05D 1/0088: Vehicle position/course control characterized by the autonomous decision-making process, e.g. artificial intelligence, predefined behaviours
    • G05D 1/0242: Two-dimensional position/course control of land vehicles using non-visible optical signals, e.g. IR or UV
    • G05D 1/0246: Using a video camera in combination with image processing means
    • G05D 1/0248: Using a video camera and image processing in combination with a laser
    • G05D 1/0257: Using a radar
    • G05D 1/0278: Using satellite positioning signals, e.g. GPS
    • G06V 40/113: Recognition of static hand signs
    • A61L 2202/14: Means for controlling sterilisation processes, data processing, presentation and storage means, e.g. sensors, controllers, programs
    • A61L 2209/11: Apparatus for controlling air treatment
    • A61L 2209/111: Sensor means, e.g. motion, brightness, scent, contaminant sensors

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Health & Medical Sciences (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • General Physics & Mathematics (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Automation & Control Theory (AREA)
  • Epidemiology (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Veterinary Medicine (AREA)
  • Public Health (AREA)
  • General Health & Medical Sciences (AREA)
  • Animal Behavior & Ethology (AREA)
  • Electromagnetism (AREA)
  • Multimedia (AREA)
  • Artificial Intelligence (AREA)
  • Evolutionary Computation (AREA)
  • Game Theory and Decision Science (AREA)
  • Medical Informatics (AREA)
  • Human Computer Interaction (AREA)
  • Business, Economics & Management (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Optics & Photonics (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
  • Manipulator (AREA)

Abstract

A sterilization robot is disclosed. The sterilization robot comprises an electronic control unit, a first communication unit, and an image acquisition device. The electronic control unit is configured to: when a living body is present around the sterilization robot, determine whether the living body is an adult and, if so, communicate with the living body's nameplate via the first communication unit to acquire the user's identification information; determine from the identification information whether the user has gesture-instruction authority and, if so, acquire a gesture image of the user with the image acquisition device and analyze it to obtain a gesture; and execute the operation corresponding to the gesture, the operations comprising at least one of stopping, turning left, turning right, reversing, going upstairs, and going downstairs. The sterilization robot can thus recognize a user's authority and act on gestures only from authorized users, completing its disinfection task efficiently without interference from unauthorized users.

Description

Disinfection robot
Technical Field
The present invention relates to a sterilization robot.
Background
An existing disinfection robot working in areas such as hospitals can generally identify typical obstacles and re-plan its route around them, achieving obstacle avoidance. However, when a living body appears around the robot, its ultraviolet disinfection can cause physiological harm, and because only the background server receives the re-planned route, the living body may still end up on that route and interfere with the disinfection work. This both consumes computing resources and fails to avoid the living body reliably.
Further, the sterility and safety requirements of the disinfection robot itself place higher demands on how users interact with it. Existing disinfection robots cannot accurately and rapidly identify living bodies and take evasive action, nor can they recognize operating instructions conveyed through body language such as gestures. They also do not respond differently to different kinds of surrounding living bodies. In shopping malls, kindergartens, or children's hospitals, for example, children are curious about the robot, and their frequent body-language signals, like those of other unauthorized living bodies, can interfere with the robot's reception of instructions and greatly affect its proper disinfection operation. Because current robots do not distinguish a child's body language from anyone else's, considerable time and computing resources are spent handling these ineffective gestures, and erroneous decisions and operations may even result.
Disclosure of Invention
The present invention has been made in view of the above problems. Its object is to provide a sterilization robot that can distinguish among surrounding living bodies, recognize users' authority, and perform operations according to the gestures of authorized users only, spending no computing resources on the distracting gestures of children. By ensuring it is not interfered with by unauthorized users, the robot markedly reduces its computing load and completes sterilization tasks efficiently.
According to one aspect of the present disclosure, a sterilization robot is provided that includes an electronic control unit, a first communication unit, and an image acquisition device. The electronic control unit is configured to: when a living body is present around the sterilization robot, determine whether the living body is an adult and, if so, communicate with the living body's nameplate via the first communication unit to acquire the user's identification information; determine from the identification information whether the user has gesture-instruction authority and, if so, acquire a gesture image of the user with the image acquisition device and analyze it to obtain a gesture; and execute the operation corresponding to the gesture, the operations including at least one of stopping, turning left, turning right, reversing, going upstairs, and going downstairs.
In some embodiments, the image acquisition device is further configured to detect whether an object is present around the sterilization robot and to send a first indication signal indicating whether an object is present. The sterilization robot further includes: a motion base fitted with wheels; an ultraviolet emitter disposed on the motion base and configured to emit ultraviolet rays to sterilize the surrounding area; an infrared sensor configured to receive the first indication signal, detect whether the detected object is a living body, and send a second indication signal indicating the presence of a living body; and an alarm configured to receive the second indication signal and issue a reminder by voice and/or light, or turn off the ultraviolet emitter.
In some embodiments, identification information is stored in the nameplate, including authority information indicating whether the user has gesture-instruction authority and the user's body-type information. The electronic control unit is further configured to perform body-type analysis on the user image acquired by the image acquisition device and determine whether it is consistent with the body-type information pre-stored in the identification information, so as to prevent fraudulent use of the nameplate.
In some embodiments, the image acquisition device comprises an area-array lidar, with which the distance between the object and the sterilization robot is measured; an image of the user is acquired for analysis only when the distance is within a predetermined range.
In some embodiments, the sterilization robot further comprises a navigation unit that locally stores a map of the hospital. The electronic control unit presets a sterilization zone for each sterilization robot and communicates with a background server to update each robot's travel route in real time. When a sterilization robot is about to enter a sterilization zone, it determines whether that zone has already been covered by the travel routes of other robots; if so, it suspends sterilization and exits the zone along the shortest route.
In some embodiments, the electronic control unit is further configured to determine, when at least one non-sterilized area exists, a shortest travel route covering the non-sterilized area(s) and to send that route to the background server.
In some embodiments, when the sterilization robot has entered a sterilization zone and is informed that a segment of its travel route has already been sterilized, the length of that segment is determined. If the length exceeds a threshold, the robot turns off the ultraviolet emitter for that segment; otherwise the emitter is kept on.
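The segment-length rule described above can be sketched minimally in Python. The threshold value is an assumed placeholder; the patent does not fix a number:

```python
SEGMENT_LENGTH_THRESHOLD_M = 5.0  # assumed threshold; the patent does not specify a value

def uv_state_for_segment(segment_length_m: float,
                         threshold_m: float = SEGMENT_LENGTH_THRESHOLD_M) -> bool:
    """Return True if the ultraviolet emitter should stay on for a route
    segment that is reported as already sterilized.

    Per the described behaviour: switch the emitter off only when the
    already-sterilized segment is longer than the threshold; for short
    segments, toggling is not worthwhile, so the emitter is kept on.
    """
    return segment_length_m <= threshold_m
```

The intuition behind the threshold is that switching a UV lamp off and back on over a very short overlap saves little energy and may wear the lamp; only a long already-covered stretch justifies it.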
In some embodiments, when the sterilization robot changes floors, the navigation unit determines the nearest elevator. After the robot enters the elevator, it issues a prompt via the alarm to maintain a safe distance, and it determines via the image acquisition device that the target floor has been reached.
In this way, the disinfection robot can receive instructions by identifying authorized users, adjust its disinfection route in time, and avoid interference from unauthorized users.
Drawings
In the drawings, which are not necessarily drawn to scale, like numerals may describe similar components in different views. Like numerals with different letter suffixes may represent different instances of similar components. The drawings illustrate various embodiments by way of example and not by way of limitation, and together with the description and claims serve to explain the disclosed embodiments. Wherever possible, the same reference numbers are used throughout the drawings to refer to the same or like parts. These embodiments are illustrative and are not intended to be an exhaustive or exclusive description of the present apparatus or method.
Fig. 1 is a schematic structural view of a sterilization robot according to an embodiment of the present invention;
FIG. 2 is a flow chart of one example of a sterilization robot according to an embodiment of the present invention;
fig. 3 is a block diagram schematically showing one example of components provided in the sterilization robot.
Detailed Description
To better convey the technical solutions of the present disclosure, embodiments are described in detail below with reference to the accompanying drawings. The embodiments are illustrative and do not limit the disclosure. Where steps have no necessary dependence on one another, the order in which they are described should not be construed as limiting; those skilled in the art will understand that such steps may be reordered without disrupting the overall process.
Fig. 1 is a schematic view of a sterilization robot according to an embodiment of the present invention. This is by way of example only; the present disclosure may be implemented as an autonomously movable sterilization robot of any configuration.
As shown in fig. 1, the sterilization robot comprises at least: an infrared sensor 1, an ultraviolet emitter 2, a navigation unit 3, a motion base 4, and a battery and data analysis main board 5. The battery may be mounted directly on the data analysis main board 5, which may include at least a first communication unit 501 and an electronic control unit 502. The motion base 4 is wheeled (and in some embodiments may also carry a crane mounted on it). The image acquisition device 6 may be mounted on the motion base 4 (or, for example, on the crane) and is configured to detect whether an object is present around the sterilization robot and to send a first indication signal indicating whether an object is present. In some embodiments, the image acquisition device 6 may be a camera or a lidar. In some embodiments, it is configured as an area-array lidar 601 to detect more accurately the distance and activity of surrounding objects (such as, but not limited to, gestures and movements), and also to monitor a living body's response after interacting with the sterilization robot (e.g., whether it has retreated beyond the safe range).
In some embodiments, the navigation unit 3 may be configured to locate the sterilization robot, for example using GPS or similar technology, to determine in real time where it is in the workplace (e.g., a hospital or shopping venue): which floor, which room, which shop, and so on.
As shown in fig. 1, the ultraviolet emitter 2 is provided on the motion base 4 and configured to emit ultraviolet rays to sterilize the surrounding area. The infrared sensor 1 is configured to receive the first indication signal, detect whether the detected object is a living body, and send a second indication signal indicating the presence of a living body. The sterilization robot further includes an alarm 7, which may be mounted on any structure convenient for issuing alerts (not specifically limited here) and is configured to receive the second indication signal and issue a reminder by voice and/or light, or turn off the ultraviolet emitter 2. The alarm 7 may cooperate with the area-array lidar and the infrared sensor 1. For example, when the area-array lidar detects a surrounding object and that object is a living body, the robot determines whether it is within the safe distance range; if not, the alarm 7 prompts it to leave, and if the lidar still does not detect the living body moving into the safe range within a following preset period, the ultraviolet emitter 2 is turned off to avoid injuring it.
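The cooperation between the lidar, the alarm 7, and the ultraviolet emitter 2 described above can be sketched as a small control loop. The safe distance, grace period, and callback names are assumptions for illustration, not values from the patent:

```python
import time

SAFE_DISTANCE_M = 2.0  # assumed safe range
GRACE_PERIOD_S = 10.0  # assumed preset period before the emitter is cut

def handle_detected_organism(distance_fn, alarm_fn, uv_off_fn,
                             safe_distance_m: float = SAFE_DISTANCE_M,
                             grace_period_s: float = GRACE_PERIOD_S,
                             poll_interval_s: float = 0.5) -> bool:
    """Alarm/lidar/UV cooperation for a living body detected nearby.

    distance_fn: returns the current lidar distance (m) to the living body.
    alarm_fn:    issues the voice/light prompt to leave.
    uv_off_fn:   switches the ultraviolet emitter off.
    Returns True if the living body reached the safe range in time,
    False if the emitter had to be shut down after the grace period.
    """
    if distance_fn() >= safe_distance_m:
        return True                      # already outside the unsafe range
    alarm_fn()                           # prompt the living body to leave
    deadline = time.monotonic() + grace_period_s
    while time.monotonic() < deadline:
        if distance_fn() >= safe_distance_m:
            return True                  # moved away in time
        time.sleep(poll_interval_s)
    uv_off_fn()                          # still too close: avoid UV injury
    return False
```

Polling the lidar against a monotonic-clock deadline keeps the shutdown decision robust to wall-clock adjustments; a real controller would likely run this on a sensor-event callback instead.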
Fig. 2 is a flowchart of one example of operation of the sterilization robot according to an embodiment of the present invention. The sterilization robot comprises a first communication unit 501, an electronic control unit 502, and an image acquisition device 6, with the electronic control unit 502 configured to perform the following steps. In step S201, if a living body is present around the sterilization robot, it is determined whether the living body is an adult.
If it is not an adult, such as a child or a pet, the robot by default performs no further identification or higher-level gesture analysis on it, and instead only reminds it to leave using voice and/or light. In some embodiments, when the living body is determined not to be an adult, the ultraviolet emitter 2 may also be turned off while the alarm 7 reminds it to leave, to avoid ultraviolet injury to children or pets with low compliance with warnings.
If it is an adult, the robot communicates with the living body's nameplate via the first communication unit 501 to acquire the user's identification information. In step S202, it determines from the identification information whether the user has gesture-instruction authority; if so, it acquires a gesture image of the user with the image acquisition device 6 and analyzes it to obtain a gesture, then executes the operation corresponding to the gesture, the operations including at least one of stopping, turning left, turning right, reversing, going upstairs, and going downstairs. By having staff at the robot's workplace wear specific nameplates, the robot can accurately distinguish them from ordinary adult users, and can distinguish staff with different levels of authority from one another, thereby limiting the group of people able to interact with it via gesture instructions and ensuring its safe use.
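The decision flow of steps S201 and S202 can be sketched as follows. The callable interfaces, the `Nameplate` structure, and the operation names are illustrative assumptions; only the branching logic and the listed operations come from the description:

```python
from dataclasses import dataclass
from typing import Optional

# Operations listed in the description as gesture-triggered.
OPERATIONS = {"stop", "turn_left", "turn_right", "reverse",
              "go_upstairs", "go_downstairs"}

@dataclass
class Nameplate:
    user_id: str
    has_gesture_authority: bool

def process_nearby_organism(is_adult: bool,
                            read_nameplate,   # () -> Optional[Nameplate]
                            capture_gesture,  # () -> Optional[str]
                            execute,          # (op: str) -> None
                            warn) -> Optional[str]:
    """S201/S202 flow: warn off non-adults, query the nameplate of
    adults, and act on gestures only from authorized users."""
    if not is_adult:
        warn()                     # child or pet: voice/light prompt only
        return None
    plate = read_nameplate()       # first communication unit 501
    if plate is None or not plate.has_gesture_authority:
        return None                # unauthorized: gestures are ignored
    gesture = capture_gesture()    # image acquisition device 6 + analysis
    if gesture in OPERATIONS:
        execute(gesture)
        return gesture
    return None
```

Note how the ordering mirrors the stated design goal: the cheap adult/nameplate checks run first, so no gesture image is captured or analyzed for unauthorized living bodies, which is where the claimed computing-load saving comes from.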
In some embodiments, the image acquisition device 6 detects whether an object such as a step, a medical instrument, a handrail, a seat, or a living body is present around the sterilization robot, and sends a first indication signal indicating whether an object is present to the infrared sensor 1. The infrared sensor 1 receives the first indication signal, detects on its basis whether the object is a living body, and sends a second indication signal indicating the presence of a living body to the alarm 7. The alarm 7 receives the second indication signal and reminds surrounding living bodies to move away by voice and/or light, or turns off the ultraviolet emitter 2 during disinfection work, so as to avoid ultraviolet harm to them.
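The first/second indication-signal chain just described can be modeled as two small stages. The `Signal` enum and the `prefer_reminder` switch are assumed representations; the description only says the alarm either reminds or turns off the emitter:

```python
from enum import Enum, auto

class Signal(Enum):
    OBJECT_PRESENT = auto()    # first indication signal (image device 6)
    ORGANISM_PRESENT = auto()  # second indication signal (infrared sensor 1)

def infrared_stage(first_signal, is_organism: bool):
    """Infrared sensor 1: on an object-present signal, decide whether the
    object is a living body and emit the second indication signal."""
    if first_signal is Signal.OBJECT_PRESENT and is_organism:
        return Signal.ORGANISM_PRESENT
    return None

def alarm_stage(second_signal, remind_fn, uv_off_fn,
                prefer_reminder: bool = True):
    """Alarm 7: on the organism signal, remind via voice/light, or turn
    the ultraviolet emitter off (the text allows either response)."""
    if second_signal is Signal.ORGANISM_PRESENT:
        (remind_fn if prefer_reminder else uv_off_fn)()
```

Splitting the pipeline into stages matches the hardware split in fig. 1: each unit consumes the previous unit's signal and makes only its own local decision.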
In step S201, the second indication signal indicating the presence of a living body is also sent to the electronic control unit 502. When a living body is present around the sterilization robot, the electronic control unit 502 further determines whether it is an adult and, if so, communicates with its nameplate via the first communication unit 501 to acquire the user's identification information. In some embodiments, the electronic control unit 502 is configured to turn off the ultraviolet emitter 2 during disinfection work when the area-array lidar 601 detects that a surrounding living body is not moving away. This may be because the living body is a child or pet that ignores the warning, or because it is a user with gesture-instruction authority who must stand close enough to gesture; in either case the emitter is shut off to avoid harming its physiological condition.
In some embodiments, identification information is stored in the nameplate, including authority information indicating whether the user has gesture-instruction authority and the user's body-type information. Only a gesture instruction issued by a user with gesture-instruction authority can be received and recognized by the sterilization robot and acted upon. The electronic control unit 502 is further configured to perform body-type analysis on the user image acquired by the image acquisition device 6 and determine whether it is consistent with the body-type information pre-stored in the identification information, so as to prevent fraudulent use of the nameplate. When identifying an authorized user, in addition to scanning the nameplate to obtain the corresponding authority, the robot must further confirm from the user's body-type features that the person wearing the nameplate is its registered owner. This prevents the nameplate from being misused by others and further improves the safety of control over the sterilization robot.
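A minimal sketch of the body-type consistency check follows. The choice of features (height and shoulder width) and the relative tolerance are assumptions for illustration; the patent only says "body type information":

```python
def verify_nameplate_wearer(measured_height_cm: float,
                            measured_shoulder_cm: float,
                            plate_height_cm: float,
                            plate_shoulder_cm: float,
                            tolerance: float = 0.08) -> bool:
    """Compare image-derived body measurements against the body-type
    information pre-stored on the nameplate, to catch a borrowed or
    stolen plate. A measurement passes if it is within a relative
    tolerance of the stored value; both features must pass."""
    def within(measured: float, stored: float) -> bool:
        return abs(measured - stored) <= tolerance * stored
    return (within(measured_height_cm, plate_height_cm)
            and within(measured_shoulder_cm, plate_shoulder_cm))
```

A relative (rather than absolute) tolerance keeps the check equally strict for large and small users; a production system would presumably use a richer feature vector from the image pipeline.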
In step S202, it is determined from the nameplate's identification information whether the user has gesture-instruction authority; if so, a gesture image of the user is acquired with the image acquisition device 6 and analyzed to obtain a gesture, and the operation corresponding to the gesture is executed, the operations including at least one of stopping, turning left, turning right, reversing, going upstairs, and going downstairs. By recognizing living bodies with gesture-instruction authority (on-site medical staff and the like), the sterilization robot can make use of operating instructions conveyed through body language such as gestures. This matters because the road conditions held by the background server may not be updated in time (e.g., temporary obstacles or living bodies) or may be estimated inaccurately (e.g., a rapid gathering of many people caused by an emergency in a hospital).
In some embodiments, the image acquisition device 6 may further comprise an area-array lidar 601, used to measure the distance between the object and the sterilization robot; an image of the user is acquired for analysis only when the distance is within a predetermined range. That is, when the object is too far from the robot, the ranging function of the area-array lidar 601 is not yet activated, avoiding unnecessary waste of computing resources. In addition, because all user images used for analysis (such as gesture instructions) are taken within the predetermined distance range, distortion of image features due to perspective is eliminated, simplifying analysis and making the recognition algorithm simpler and more accurate. Specifically, when the user's identification information is acquired by communicating with the nameplate via the first communication unit 501, and the user's image is acquired for scanning analysis (for example with an image sensor or scanner) while the distance is within the predetermined range, the longitudinal and transverse scanning range of the nameplate by the first communication unit 501 may be set in advance, and only this preset range scanned to read the identification information. This markedly speeds up scanning and processing of the nameplate and shortens the robot's response time to nearby emergencies.
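The distance gate can be expressed in a few lines. The range bounds are assumed placeholder values:

```python
MIN_RANGE_M, MAX_RANGE_M = 0.5, 3.0  # assumed predetermined distance range

def maybe_acquire_image(distance_m: float, acquire_fn):
    """Gate image acquisition on the area-array lidar range measurement:
    capture (and spend analysis compute) only when the user stands inside
    the predetermined range, which also keeps perspective distortion
    roughly constant for the gesture recognizer."""
    if MIN_RANGE_M <= distance_m <= MAX_RANGE_M:
        return acquire_fn()
    return None
```

A lower bound is included as well as an upper one, since a user standing almost against the lens would distort gesture features just as badly as one standing too far away.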
In some embodiments, the first communication unit 501 and the nameplate may communicate in various contactless modes, including but not limited to code scanning with an image sensor, NFC (near-field communication), and Bluetooth. In some embodiments, when an image sensor is used to scan a code, the nameplate can be passive (requiring no transmitting power), further reducing cost.
In some embodiments, the sterilization robot may further comprise a navigation unit 3 that locally stores a map of the hospital. In some application scenarios, sterilization robots are deployed in groups. The electronic control unit 502 may preset a sterilization area for each sterilization robot and communicate with a background server to update each robot's travel route in real time; the background server dispatches the individual robots. When a sterilization robot is about to enter a sterilization zone, it determines whether that zone is already covered by the travel route of another sterilization robot; if so, it suspends sterilization and exits the zone along the shortest route. Avoiding overlap between the robots' routes prevents repeated sterilization of the same area and improves the efficiency of the group. Because each robot's travel route is updated in real time, the routes of the other robots in the group can be adaptively adjusted, optimizing the sterilization efficiency of the whole group.
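The group-coordination rule above can be sketched as a coverage check against peers' routes. The data model (zone ids, a route represented as a list of zone ids) is an assumption chosen for illustration; the patent does not specify how routes are encoded.

```python
# Illustrative sketch: before entering a sterilization zone, check whether
# any peer robot's travel route already covers it; if so, suspend and exit.
def zone_already_covered(zone_id, peer_routes):
    """peer_routes: mapping of robot id -> list of zone ids on its route."""
    return any(zone_id in route for route in peer_routes.values())

def on_zone_boundary(zone_id, peer_routes):
    if zone_already_covered(zone_id, peer_routes):
        return "suspend_and_exit_shortest_route"
    return "sterilize_zone"
```

In practice `peer_routes` would be the real-time route table maintained by the background server.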
In some embodiments, the electronic control unit 502 is further configured, when at least one non-sterilized area exists, to determine the shortest travel route covering the non-sterilized area(s) and send it to the background server. During sterilization work, when the sterilization robot reaches an area it first determines whether that area has already been sterilized; if not, the route-planning function of the electronic control unit 502 is triggered, the shortest travel route covering the non-sterilized area(s) is computed, and the route is sent to the background server so that the server's real-time dispatch of each robot's sterilization route stays up to date.
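One way such a covering route could be planned is a nearest-neighbor tour over the centers of the non-sterilized areas. The patent does not specify the algorithm; this greedy heuristic is an assumption chosen for brevity (an exact shortest tour would be a traveling-salesman computation).

```python
# Hypothetical route planner: greedy nearest-neighbor tour over zone centers.
import math

def plan_route(start, zones):
    """start: (x, y) position; zones: dict of zone id -> (x, y) center.
    Returns a visiting order of zone ids, always moving to the nearest
    remaining zone."""
    remaining = dict(zones)
    pos, order = start, []
    while remaining:
        nearest = min(remaining, key=lambda z: math.dist(pos, remaining[z]))
        order.append(nearest)
        pos = remaining.pop(nearest)
    return order
```

The resulting order would then be reported to the background server as the robot's updated travel route.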
In some embodiments, when the sterilization robot has entered a sterilization zone and is notified that a section of its travel route has already been sterilized, the length of that section is determined: if it is longer than a threshold value, the robot turns off the ultraviolet emitter 2 for that section; if it is not longer than the threshold, the ultraviolet emitter 2 is kept on. Although the ultraviolet emitter 2 could always be turned off in an already-sterilized section, frequent switching shortens its lifetime. When the already-sterilized section of the current route is longer than the threshold, the power saved by turning the emitter off is considered to outweigh the adverse effect of the off-on cycle on the ultraviolet emitter 2; when the section is not longer than the threshold, the adverse effect of switching is considered to outweigh the power saving, and the emitter is kept on for continuous sterilization.
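The switch-off trade-off reduces to a single threshold comparison. The threshold value below is an assumed parameter; the patent leaves the actual value unspecified.

```python
# Minimal sketch of the lamp-wear vs. power-saving decision for an
# already-sterilized segment. Threshold is an assumed example value.
SEGMENT_THRESHOLD_M = 20.0  # assumed trade-off point, meters

def uv_action_for_segment(segment_length_m, threshold_m=SEGMENT_THRESHOLD_M):
    """'off' when the segment is longer than the threshold (power saving
    wins); 'keep_on' otherwise (avoid wear from frequent switching)."""
    return "off" if segment_length_m > threshold_m else "keep_on"
```

Note that a segment exactly at the threshold keeps the emitter on, matching the "not longer than the threshold" wording.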
In some embodiments, when the sterilization robot changes floors, the navigation unit 3 determines the nearest elevator; after the robot enters the elevator, it issues a prompt via the alarm 7 asking people to maintain a safe distance, and it determines via the image acquisition device 6 that the target floor has been reached. In some embodiments, the sterilization robot may shut off ultraviolet emission immediately upon entering the elevator and down-regulate the safe-distance range applied to surrounding organisms.
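The elevator behavior can be sketched as a simple state update on entry. The distance values are assumed examples; the patent only states that the safe distance is down-regulated in the elevator.

```python
# Illustrative elevator-entry handler: UV off, safe distance reduced,
# and a spoken prompt issued. Distances are assumed example values.
NORMAL_SAFE_DISTANCE_M = 2.0    # assumed open-corridor safe distance
ELEVATOR_SAFE_DISTANCE_M = 0.5  # assumed down-regulated in-elevator value

def enter_elevator(robot_state):
    """Mutate and return the robot state on entering an elevator."""
    robot_state["uv_on"] = False
    robot_state["safe_distance_m"] = ELEVATOR_SAFE_DISTANCE_M
    robot_state["announce"] = "please keep a safe distance"
    return robot_state
```

Reducing the safe-distance band reflects that people in an elevator cannot avoid standing close to the robot.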
Fig. 3 is a block diagram schematically showing one example of the components of the sterilization robot. The sterilization robot system includes an infrared sensor 1, an ultraviolet emitter 2, a navigation unit 3, a motion base 4, a battery (not shown), a data analysis main board 5 (including the first communication unit 501 and the electronic control unit 502), an image acquisition device 6 (which may include, but is not limited to, an area array lidar 601), and an alarm 7. These components are connected by a bus to form the sterilization robot system and maintain the robot's normal operation. Through the first communication unit 501, the sterilization robot can connect remotely to a background server, enabling the server to dispatch the robot and exchange data with it.
In some embodiments, the electronic control unit may be implemented with various processors, and the steps it performs may be implemented by a processor executing computer-executable instructions stored in memory. In some embodiments, the processor may include, but is not limited to, any one or more of a microprocessor, an ASIC (application-specific integrated circuit), an FPGA (field-programmable gate array), an SoC (system on a chip), or a DSP (digital signal processing) chip.
In summary, the invention discloses a sterilization robot that can identify organisms appearing around it and issue an alarm prompting them to move away, thereby avoiding physiological injury from ultraviolet rays. The sterilization robot can also recognize operation instructions conveyed by an organism's body language (such as gestures); compared with waiting for a re-planned route from the background server (during which organisms may still appear on the route), this lets the robot avoid organisms promptly and accurately. Further, by identifying nameplates and reminding users, children and other unauthorized organisms are prevented from interfering with the robot's normal sterilization work.
Although exemplary embodiments have been described herein, the scope thereof includes any and all embodiments having equivalent elements, modifications, omissions, combinations (e.g., of the various embodiments across schemes), adaptations or alterations based on the present disclosure. Elements in the claims are to be construed broadly based on the language employed in the claims and are not limited to examples described in the present specification or during the practice of the present application, which examples are to be construed as non-exclusive. It is intended, therefore, that the specification and examples be considered as exemplary only, with a true scope and spirit being indicated by the following claims and their full scope of equivalents.
The above description is intended to be illustrative and not restrictive. For example, the above-described examples (or one or more aspects thereof) may be used in combination with each other, and other embodiments may be devised by those of ordinary skill in the art upon reading the above description. In addition, in the above detailed description, various features may be grouped together to streamline the disclosure. This is not to be interpreted as intending that an unclaimed disclosed feature is essential to any claim; rather, inventive subject matter may lie in less than all features of a particular disclosed embodiment. Thus, the following claims are hereby incorporated into the detailed description as examples or embodiments, with each claim standing on its own as a separate embodiment, and it is contemplated that these embodiments may be combined with one another in various combinations or permutations. The scope of the invention should be determined with reference to the appended claims, along with the full scope of equivalents to which such claims are entitled.

Claims (8)

1. A sterilization robot comprising an electronic control unit, a first communication unit, an infrared sensor, an alarm, and an image acquisition device, wherein the image acquisition device is configured to: detect whether an object exists around the sterilization robot, and send a first indication signal indicating whether the object exists;
the infrared sensor is configured to: receive the first indication signal, detect whether the existing object is an organism, and send a second indication signal indicating the presence of the organism;
the alarm is configured to: receive the second indication signal and issue a reminder by at least one of voice and light, or turn off an ultraviolet emitter;
the electronic control unit is configured to: when an organism exists around the sterilization robot, determine whether the organism is an adult and, if so, communicate with a nameplate of the organism via the first communication unit to acquire identification information of a user; if not, perform no identity recognition or higher-level gesture analysis on the organism, but remind the organism to leave using at least one of voice and light; and
based on the identification information, determine whether the user has gesture instruction authority and, if so, acquire a gesture image of the user with the image acquisition device and analyze it to obtain a gesture; and execute an operation corresponding to the gesture, the operations comprising at least one of stopping, turning left, turning right, backing up, going upstairs, and going downstairs.
2. The sterilization robot of claim 1, further comprising:
a motion base configured with wheels; and
an ultraviolet emitter disposed on the motion base and configured to emit ultraviolet rays to achieve sterilization of the peripheral region.
3. The sterilization robot according to claim 1, wherein identification information is stored in the nameplate, the identification information including authority information indicating whether the user has the gesture instruction authority and body type information of the user; and the electronic control unit is further configured to: perform body-type analysis on the image of the user acquired by the image acquisition device to determine whether it matches the body type information pre-stored in the identification information, so as to prevent the nameplate from being fraudulently used.
4. The sterilization robot according to claim 1, wherein the image acquisition device includes an area array lidar with which a distance between the object and the sterilization robot is measured; and acquiring an image of the user for analysis when the distance is within a predetermined distance range.
5. The sterilization robot according to claim 2, further comprising a navigation unit that locally stores a map of a hospital; wherein the electronic control unit presets a sterilization area for each sterilization robot and communicates with a background server to update the travel route of each sterilization robot in real time; and when the sterilization robot is about to enter a sterilization area, it determines whether the sterilization area is already covered by the travel route of another sterilization robot and, if so, suspends sterilization and drives out of the sterilization area along the shortest route.
6. The sterilization robot of claim 5, wherein the electronic control unit is further configured to determine a shortest travel route covering at least one non-sterilized area when the at least one non-sterilized area exists, and send the shortest travel route to the background server.
7. The sterilization robot of claim 5, wherein, when the sterilization robot has entered one of the sterilization areas and is prompted that a section of its travel route has already been sterilized, the length of the section is determined; if the length is longer than a threshold value, the sterilization robot turns off the ultraviolet emitter in the section, and if the length is not longer than the threshold value, the ultraviolet emitter is kept on.
8. The sterilization robot according to claim 5, wherein the navigation unit determines the nearest elevator when the sterilization robot changes floors, and wherein, after entering the elevator, the sterilization robot issues a prompt to maintain a safe distance via the alarm and determines via the image acquisition device that the destination floor has been reached.
CN202010454119.1A 2020-05-26 2020-05-26 Disinfection robot Active CN113171472B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010454119.1A CN113171472B (en) 2020-05-26 2020-05-26 Disinfection robot

Publications (2)

Publication Number Publication Date
CN113171472A CN113171472A (en) 2021-07-27
CN113171472B true CN113171472B (en) 2023-05-02

Family

ID=76921411

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010454119.1A Active CN113171472B (en) 2020-05-26 2020-05-26 Disinfection robot

Country Status (1)

Country Link
CN (1) CN113171472B (en)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2016106294A (en) * 2015-12-28 2016-06-16 墫野 和夫 Fully automatic robot household electric system appliance
CN106622728A (en) * 2017-02-28 2017-05-10 北京兆维电子(集团)有限责任公司 Fog gun type belt anti-epidemic robot

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20110003146A (en) * 2009-07-03 2011-01-11 한국전자통신연구원 Apparatus for econgnizing gesture, robot system using the same and method for econgnizing gesture using the same
JP2013065112A (en) * 2011-09-15 2013-04-11 Omron Corp Gesture recognition device, electronic apparatus, control method of gesture recognition device, control program, and recording medium
US10022041B2 (en) * 2012-06-27 2018-07-17 Camplex, Inc. Hydraulic system for surgical applications
CN105334851A (en) * 2014-08-12 2016-02-17 深圳市银星智能科技股份有限公司 Mobile device capable of sensing gesture
CN107765855A (en) * 2017-10-25 2018-03-06 电子科技大学 A kind of method and system based on gesture identification control machine people motion
CN107894836B (en) * 2017-11-22 2020-10-09 河南大学 Human-computer interaction method for processing and displaying remote sensing image based on gesture and voice recognition
CN108776473A (en) * 2018-05-23 2018-11-09 上海圭目机器人有限公司 A kind of working method of intelligent disinfecting robot
CN108958490B (en) * 2018-07-24 2021-09-17 Oppo(重庆)智能科技有限公司 Electronic device, gesture recognition method thereof and computer-readable storage medium

Also Published As

Publication number Publication date
CN113171472A (en) 2021-07-27

Similar Documents

Publication Publication Date Title
US11684526B2 (en) Patient support apparatuses with navigation and guidance systems
AU2009237932B2 (en) Position-monitoring device for persons
US9235216B2 (en) Running information generating apparatus of autonomous running apparatus, running information generating method, running information generating program, and autonomous running apparatus
US20070247316A1 (en) Article locating and tracking apparatus and method
EP2317700A1 (en) A system and method for monitoring hygiene standards compliance
CN109414240B (en) Method for operating an at least partially autonomously mobile medical treatment unit and mobile medical treatment unit
US20220262183A1 (en) Control system, control method, and program
JP2013506200A (en) Hygiene monitoring system and method
US10271772B2 (en) Systems and methods for warning of a protruding body part of a wheelchair occupant
KR20110108264A (en) Robot apparatus, information providing method carried out by the robot apparatus and computer storage media
CN113945890A (en) Position tracking and distance monitoring system and related method
JP2015108544A (en) Control device, control system and program
JP2007150435A (en) Communication system
EP3570134B1 (en) System for evacuating one or more mobile robots
CN108898815A (en) Promote the method and control device, medical system of Medical Devices remote control security
JPH0511039A (en) Presence confirming system
US11971721B2 (en) Autonomous mobile robot control system, control method thereof, a non-transitory computer readable medium storing control program thereof, and autonomous mobile robot control device
CN113171472B (en) Disinfection robot
US20200288675A1 (en) Wearable sensor device to assist vision-impaired animal
KR20220052073A (en) Driving control method of roving robot with detection area indicator
Jeyaseelan WR et al. Efficient Intelligent Smart Ambulance Transportation System using Internet of Things
US20210089043A1 (en) System for cooperative movement control and/or movement supervision of mobile medical components
JP2004251816A (en) Apparatus and system for managing position of moving body
US10940796B2 (en) Intent communication for automated guided vehicles
CN113311839A (en) Intelligent robot control method and system for public area disinfection

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant