GB2604781A - Robot air filter - Google Patents

Robot air filter

Info

Publication number
GB2604781A
Authority
GB
United Kingdom
Prior art keywords
robot
air
filter
air filter
intelligent
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
GB2206738.3A
Other versions
GB2604781B (en)
Inventor
Fox Harry
Sapojnikov Sergh
Gorelick Andrew
Bachman Gabriel
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
X Tend Robotics Inc
Original Assignee
X Tend Robotics Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by X Tend Robotics Inc filed Critical X Tend Robotics Inc
Publication of GB2604781A publication Critical patent/GB2604781A/en
Application granted granted Critical
Publication of GB2604781B publication Critical patent/GB2604781B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • F MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
    • F24 HEATING; RANGES; VENTILATING
    • F24F AIR-CONDITIONING; AIR-HUMIDIFICATION; VENTILATION; USE OF AIR CURRENTS FOR SCREENING
    • F24F8/00 Treatment, e.g. purification, of air supplied to human living or working spaces otherwise than by heating, cooling, humidifying or drying
    • F24F8/10 Treatment by separation, e.g. by filtering
    • F24F8/108 Treatment by separation using dry filter elements
    • F24F8/20 Treatment by sterilisation
    • F24F8/22 Treatment by sterilisation using UV light
    • F24F8/30 Treatment by ionisation
    • F24F8/80 Self-contained air purifiers
    • F24F1/00 Room units for air-conditioning, e.g. separate or self-contained units or units receiving primary air from a central station
    • F24F1/02 Self-contained room units for air-conditioning, i.e. with all apparatus for treatment installed in a common casing
    • F24F1/04 Arrangements for portability
    • F24F1/0328 Self-contained room units with means for purifying supplied air
    • F24F1/035 Self-contained room units with means for purifying supplied air, characterised by the mounting or arrangement of filters
    • F24F11/00 Control or safety arrangements
    • F24F11/50 Control or safety arrangements characterised by user interfaces or communication
    • F24F11/56 Remote control
    • F24F11/62 Control or safety arrangements characterised by the type of control or by internal processing, e.g. using fuzzy logic, adaptive control or estimation of values
    • F24F11/63 Electronic processing
    • F24F11/64 Electronic processing using pre-stored data
    • F24F2120/00 Control inputs relating to users or occupants
    • F24F2120/10 Occupancy
    • F24F2120/12 Position of occupants
    • F24F2130/00 Control inputs relating to environmental factors not covered by group F24F2110/00
    • F24F2130/10 Weather information or forecasts
    • F24F2221/00 Details or features not otherwise provided for
    • F24F2221/12 Transportable
    • F24F2221/125 Transportable, mounted on wheels
    • F24F2221/38 Personalised air distribution
    • F24F2221/42 Mobile autonomous air conditioner, e.g. robots

Landscapes

  • Engineering & Computer Science (AREA)
  • Chemical & Material Sciences (AREA)
  • Combustion & Propulsion (AREA)
  • Mechanical Engineering (AREA)
  • General Engineering & Computer Science (AREA)
  • Signal Processing (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • Fuzzy Systems (AREA)
  • Mathematical Physics (AREA)
  • Accommodation For Nursing Or Treatment Tables (AREA)

Abstract

A mobile robot for providing clean air to different locations comprises a housing resistant to UV radiation irradiated from a UV source (602, Fig. 6A) located within the housing for disinfecting air, and a fan 104 that communicates air from an intake of the housing through a filter (604), which removes particulates, and past the UV source, so that a clean, disinfected air flow is expelled from the housing and directed towards a desired target or direction. The filter may be a HEPA filter and may be removable, and the air may be ionised. The robot may move autonomously, may comprise means for recognising a person or the desired direction, and may comprise a plurality of fans communicating the air through a body 108 of the robot, through tubes 106 and out through vents 110. A memory may store information about potential targets and their identifying characteristics, and a processing unit may analyse external information against those characteristics to direct clean air toward the target. The robot may detect or track targets using a camera and may comprise means to orient itself to face the target and to control operation of the fan(s).

Description

Robot Air Filter
Field of the Invention
This invention pertains to the field of robotics, and, in particular, to a robot with a decontamination filter.
Background of the Invention
Ultraviolet (UV) is a form of electromagnetic radiation with a wavelength from 10 nm (with a corresponding frequency around 30 PHz) to 400 nm (750 THz), shorter than that of visible light but longer than X-rays. The UV spectrum is conventionally subdivided as follows:

Name | Abbreviation | Wavelength (nm) | Photon energy (eV, aJ) | Notes
Ultraviolet A | UV-A | 315-400 | 3.10-3.94, 0.497-0.631 | Long-wave UV, black light, not absorbed by the ozone layer; soft UV.
Ultraviolet B | UV-B | 280-315 | 3.94-4.43, 0.631-0.710 | Medium-wave UV, mostly absorbed by the ozone layer; intermediate UV; Dorno radiation.
Ultraviolet C | UV-C | 100-280 | 4.43-12.4, 0.710-1.987 | Short-wave UV, germicidal UV, ionizing radiation at shorter wavelengths, completely absorbed by the ozone layer and atmosphere; hard UV.
Near ultraviolet | N-UV | 300-400 | 3.10-4.13, 0.497-0.662 | Visible to birds, insects, and fish.
Middle ultraviolet | M-UV | 200-300 | 4.13-6.20, 0.662-0.993 |
Far ultraviolet | F-UV | 122-200 | 6.20-10.16, 0.993-1.628 | Ionizing radiation at shorter wavelengths.
Hydrogen Lyman-alpha | H Lyman-α | 121-122 | 10.16-10.25, 1.628-1.642 | Spectral line at 121.6 nm, 10.20 eV.
Extreme ultraviolet | E-UV | 10-121 | 10.25-124, 1.642-19.867 | Entirely ionizing radiation by some definitions; completely absorbed by the atmosphere.
Vacuum ultraviolet | V-UV | 10-200 | 6.20-124, 0.993-19.867 | Strongly absorbed by atmospheric oxygen, though 150-200 nm wavelengths can propagate through nitrogen.

The safest and most effective UV frequency used for germicidal purposes is UV-C, particularly close to 222 nm. One should be careful of old or cheap UV filters that can be made with mercury, which can produce toxic ozone as well as release dangerous mercury if misused. The EPA currently does not approve or certify any disinfectant UV products. Underwriters Laboratories (UL), however, does certify UV disinfecting products. One of the tests performed on UV products is the log inactivation test. "Log inactivation" is a convenient way to express the number or percent of microorganisms inactivated (killed or unable to replicate) through the disinfection process. For example, a 3-log inactivation value means that 99.9% of the microorganisms of interest have been inactivated.
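The arithmetic behind log inactivation can be sketched in a few lines; the function names below are illustrative, not part of the patent:

```python
import math

def log_inactivation_percent(n_log: float) -> float:
    """Percent of microorganisms inactivated for an n-log reduction.
    A 3-log reduction leaves 10**-3 of the population, i.e. 99.9% inactivated."""
    return (1 - 10 ** (-n_log)) * 100

def log_reduction(initial_count: float, surviving_count: float) -> float:
    """Log inactivation value from before/after microorganism counts."""
    return math.log10(initial_count / surviving_count)
```

For example, reducing a population from one million organisms to one thousand is a 3-log (99.9%) inactivation.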
UV light can be harmful to humans, especially in high amounts, and most commonly comes from the sun. This is why sunscreen is used to protect the skin, and UV-protective sunglasses to protect the eyes. UV light can penetrate cells and affect their DNA/RNA, which can disrupt cell reproduction. Therefore, it can be harmful to viruses, bacteria, and even humans. At the frequency of UV-C, more particularly around 222 nm, UV can easily pierce through viruses and bacteria but cannot penetrate very far through human skin. Therefore, this frequency is often used in commercial products.
Light-emitting diodes (LEDs) can be manufactured to emit radiation in the ultraviolet range. In 2019, following significant advances over the preceding five years, UV-A LEDs of 365 nm and longer wavelength were available with efficiencies of 50% at 1.0 W output. Currently, the most common types of UV LEDs are at 395 nm and 365 nm wavelengths, both of which are in the UV-A spectrum. The rated wavelength of a UV LED is the peak wavelength it generates; light at wavelengths both above and below the peak is also present, which is important to consider when applying them for certain purposes.
The cheaper and more common 395 nm UV LEDs are much closer to the visible spectrum. These LEDs do not operate only at their peak wavelength: they also give off a purple color and do not emit pure UV light, unlike other UV LEDs that are deeper into the spectrum. Such LEDs are increasingly used for applications such as UV curing and charging glow-in-the-dark objects (such as paintings or toys). They are becoming very popular in a process known as retro-brightening, which speeds up the refurbishing/bleaching of old plastics, and in portable flashlights for detecting counterfeit money and bodily fluids. LEDs are already successful in digital print applications and inert UV curing environments. Power densities approaching 3 W/cm² (30 kW/m²) are now possible, and this, coupled with recent developments by photo-initiator and resin formulators, makes the expansion of LED-cured UV materials likely.
UV-C LEDs are developing rapidly, but may require testing to verify effective disinfection. Citations for large-area disinfection refer to non-LED UV sources known as germicidal lamps. They are also used as line sources to replace deuterium lamps in liquid chromatography instruments.
UV radiation can generally be contained by opaque materials such as cardboard or wood. Transparent materials such as glass, PVC (polyvinyl chloride), plexiglass and Perspex block UV radiation to varying degrees. Generally, polycarbonate plastics provide adequate UV protection. Some kinds of clear glass (including some kinds of window glass and optical glass) transmit significant amounts of UV-A radiation.
Accordingly, there is a need in the industry and field for a mobile air filter.
Summary of the Invention
To achieve these and other objects, the device disclosed herein is a mobile robot with air filters to disinfect local air.
Therefore, to achieve these and other objects, the herein disclosed invention is an intelligent, multi-function robot including: a housing built into the robot and made of a UV-blocking material; an air filter positioned within the housing; and a fan directing air flow through the air filter, the intake and outflow of said airflow through the air filter being placed to direct clean air toward a targeted direction. The air filter may be a UV air cleaner, HEPA filter, ionization air cleaner and/or screen filter. Preferably, the robot is capable of autonomous movement. Additionally, the robot includes means for recognizing a targeted person or direction and means for orienting said robot in relation to said targeted person or direction. The robot may further include multiple fans, tubes and vents, wherein the air flow is blown by the multiple fans via the tubes through a body of the robot and out through the vents towards a targeted direction. In some embodiments, the air filter can be removably positioned within the robot.
According to a preferred embodiment, the robot further includes a means for identifying the targeted direction. This means for identifying the targeted direction includes: a memory module for the robot containing information about potential targets and identifying characteristics of the potential targets; scanners receiving external information about targets in a vicinity of the robot; and, a processing unit analyzing the external information to identify corresponding identifying characteristics of the potential targets and to then direct clean air toward the target.
In some embodiments, the robot may further comprise means for detecting or tracking targets and means for orienting the robot to face a target. The means for detecting or tracking may include a camera image feed and the means for orienting includes inputs to control the operation of the fan.
Brief Description Of The Figures
For a better understanding of the invention and to show how it may be carried into effect, reference will now be made, purely by way of example, to the accompanying Figures. It is stressed that the particulars shown are by way of example and for purposes of illustrative discussion of the preferred embodiments of the present invention only, and are presented in the cause of providing what is believed to be the most useful and readily understood description of the principles and conceptual aspects of the invention.
Figures 1a and 1b are front (head-on) and side (cross-sectional) views of the invention as integrated into the body of the intelligent robot.
Figure 2 is a head-on view of the invention integrated into a hollow variation of the robot body.
Figures 3a and 3b are front (head-on) and side (cross-sectional) views of the invention as integrated into the torso of the intelligent robot.
Figures 4a and 4b are front and side views of a first embodiment, and Figs. 4c and 4d are front and side views of a second embodiment, of a filter unit mounted as an external module on the robot.
Figures 5a, 5b, 5c, 5d and 5e show front views of various embodiments, showing possible locations at which the fan module can be mounted on the robot.
Figures 6a and 6b show a cross-section of the main filter unit with UV LEDs, a HEPA filter, and a UV filter.
There is no Figure 7; the figure number was inadvertently skipped.
Figures 8 and 9 show sample possible images taken via the Robot's camera.
Figure 10 shows human bodies detected and tracked in the Robot's camera image feed.
Figure 11 is a flow chart describing the manner in which the device operates.
Figures 12a, 12b and 12c illustrate possible microphone and camera implementations for the device.
Figure 13 shows the logical flow for how the robot uses the image processing results to generate a motor response in which the robot physically turns to face a desired target.
Detailed Description of the Invention
In the following detailed description, numerous specific details are set forth in order to provide a thorough understanding of the invention. However, it will be understood by those skilled in the art that the present invention may be practiced without these specific details. In other instances, well-known methods, procedures, and components have not been described in detail so as not to obscure the present invention.
In a basic embodiment, the invention constitutes a mobile robot with air filters to disinfect local air. This invention relates to an intelligent, multi-function robot including: a housing built into the robot and made of a UV-blocking material; an air filter positioned within the housing; and a fan directing air flow through the air filter, the intake and outflow of said airflow through the air filter being placed to direct clean air toward a targeted direction. The air filter may be a UV air cleaner, HEPA filter, ionization air cleaner and/or screen filter. Preferably, the robot is capable of autonomous movement. Additionally, the robot includes means for recognizing a targeted person or direction and means for orienting said robot in relation to said targeted person or direction. The robot may further include multiple fans, tubes and vents, wherein the air flow is blown by the multiple fans via the tubes through a body of the robot and out through the vents towards a targeted direction. In some embodiments, the air filter can be removably positioned within the robot.
According to a preferred embodiment, the robot further includes a means for identifying the targeted direction. This means for identifying the targeted direction includes: a memory module for the robot containing information about potential targets and identifying characteristics of the potential targets; scanners receiving external information about targets in a vicinity of the robot; and, a processing unit analyzing the external information to identify corresponding identifying characteristics of the potential targets and to then direct clean air toward the target.
In some embodiments, the robot may further comprise means for detecting or tracking targets and means for orienting the robot to face a target. The means for detecting or tracking may include a camera image feed and the means for orienting includes inputs to control the operation of the fan.
Figures 1a and 1b show a "Solid Torso Configuration": a head-on and a cross-sectional view of the invention as integrated into the body of the intelligent robot. The filter system 100 is in the base 102 of the robot, and air is blown with the help of multiple fans 104 via tubes 106 that pass through the body 108 of the robot and out through vents 110 towards the user [not shown].
Figure 2 shows a "Hollow Torso Configuration": a head-on view of the invention integrated into a hollow variation of the robot body. There is no essential difference between this and the solid torso configuration (Figs. 1a and 1b), except the shape of the vents 110 and the placement of the tubes 106.
Figures 3a and 3b show a "Torso Mounting" of a head-on and a cross-sectional view of the invention as integrated into the torso of the intelligent robot. The invention is thus built into the torso, instead of the base of the robot, which may improve airflow. It may be appreciated that the invention can be integrated into any part of the body of the robot.
Figures 4a and 4b show front and side views of a first embodiment, and Figs. 4c and 4d show front and side views of a second embodiment, of a filter unit 400 mounted as an external module 402 on the robot. They show a "Mountable Module" as an example of a filter unit mounted as an external module on the robot. The filter is mounted on a shelf 406 in the middle of the robot's torso 408, and airflow is directed through tubes 410 and out through vents 412, all of which are external to the robot's body. Figures 4a, 4b, 4c, and 4d illustrate the modularity of the unit. The robot can have a built-in vent system and a slot in which the fan module can be inserted and removed, like a cartridge.
Figures 5a, 5b, 5c, 5d and 5e show examples of the different locations at which the fan module 500 can be mounted on the robot.
The "Filter Module" (Figures 6a and 6b) is shown in a cross-section of the main filter unit 600 with UV LEDs 602, a HEPA filter 604, a UV filter 606, and the fan 608. The unit contains additional slots for the insertion of additional filters. In Figure 6a, the UV LEDs 602 are integrated onto the inside of the casing 610 along rows. Figure 6b shows the UV LEDs 602 integrated into the slot system 612.
In a preferred embodiment, a unit [or module] that comprises an air filter is attached to the robot. This module may be situated in the base of the robot, along the sides of the robot, or as an additional module attached to the robot. The air filter preferably comprises at least one of a UV air cleaner, HEPA filter, ionization air cleaner, screen filter or other known air filtering method. If the module comprises a UV air filter, then the UV air filter is housed within a UV-blocking material, which may be, for example, a UV-safe plastic, to keep people and the robot safe from the negative effects of the UV light. This allows people to be in the same room as the UV air cleaner. The module can be made removable, so that any filters contained within can be cleaned or replaced when needed, which may help for cleaning and repair purposes. The module may have designated slots so that additional filters can be added, removed, or changed.
Referring now to Figures 6a and 6b, the air filter 600 comprises one or more fans 608 placed to direct air flow through the air filter. The fan or fans blow air from outside the robot into the filter. Filtered air is blown out of one or more vents towards users, potentially with the help of pipes and additional fans. The robot can maneuver to direct the outflow of air from the filter towards the user. The robot can do this because it knows how to face the user through the use of image recognition and/or voice recognition. For image recognition (recognition of persons, faces, and position), images received from one or more cameras or scanners mounted on the robot are sent to a central processing unit (CPU) that runs an algorithm on the data to detect the position of any human body in the images. Depending on the position of the body, the CPU sends commands to the robot's motors to orient the robot towards the body. The position of the vents may also be adjustable, both manually and via electronic command, to meet users' preferences. The above example of image recognition applies equally to audio recognition (voice matching, direction of arrival (DOA), and NLP).
The presumed shape and structure of the filter module is a UV-blocking casing housing an air filter, with openings at opposing ends to allow air flow through the casing. The air filter comprises an outside and an inside, where UV-producing devices, for example UV LEDs 602, face inwards on the inside. There may be multiple rows of UV LEDs all around the inside of the air filter. The UV filter may be lined on the inner side of the air filter casing (placed along rows or columns) or alternatively can be part of the filter slot system, placed on the inner side of the filter slot. The air filter may include a fan to direct airflow through it. This fan can be placed separate from the casing housing the air filter, attached to an open side of the air filter, or housed within the casing housing the air filter. In accordance with the invention, the air filter may have a HEPA filter and/or a screen filter and/or an ionic filter attached to it to filter the air. The filter may have designated slots to enable easy insertion and removal of these filters. The air intake can be at a set point that maximizes airflow into the filter and/or placed in an opposing direction to the user; for example, it may be placed in the base, close to the floor, at the back. The air outflow can be at a set point that maximizes cleaned airflow toward the user, such as horizontally placed alongside the front of the torso.
The Robot can learn to recognize a set of faces by taking multiple pictures of a user's face and running an algorithm that extracts key data about the face, such as eye, lip, and nose position, and stores the data. Images are taken via the Robot's camera and then sent via a wired or wireless connection to a central computing unit that runs the algorithms and stores the data. The algorithms may comprise, but are not limited to, an initial filtering phase, such as a Haar classifier, which breaks images down into core features like edges and lines [see Figure 8].
If the groupings of the features seen in the image match human facial features, such as eyes, nose, mouth, and chin, then a face is detected [see Figure 9]. Once a face is detected, the CPU performs a more computationally intense analysis in which the face is broken down into distinguishable landmarks, which can include, but are not limited to, distance between eyebrows, skin color, eye color, nose length, and chin size. These landmarks are then saved by the CPU. This process of facial detection and learning is repeated with multiple users to build a database of known users' faces. As the Robot learns new faces, it can also be taught the specific fan preferences of each user, such as fan speed, distance, and UV intensity. When the Robot is later asked whether or not it recognizes an unknown user standing in front of it, it takes pictures of the unknown user's face, uses the above-described facial detection and recognition algorithms to extract the facial landmarks, and compares those landmarks to the known faces stored in the database. This comparison is performed by an artificial intelligence algorithm such as a deep neural network. If the unknown facial features match any of those known users, the Robot adjusts the fan operation according to that user's preferences.
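The landmark-comparison step described above can be sketched as a nearest-neighbor lookup over stored feature vectors. This is a minimal illustration, not the patent's implementation: the feature values, user names, and distance threshold below are hypothetical, and a deep neural network would replace the plain Euclidean distance in practice.

```python
import math

# Hypothetical database: user name -> landmark feature vector
# (e.g. normalized eyebrow distance, nose length, chin size).
known_faces = {
    "alice": [0.42, 0.31, 0.18],
    "bob":   [0.55, 0.27, 0.22],
}

def match_face(landmarks, database, threshold=0.05):
    """Return the closest known user, or None if no stored face is near enough."""
    best_name, best_dist = None, float("inf")
    for name, reference in database.items():
        dist = math.dist(landmarks, reference)  # Euclidean distance
        if dist < best_dist:
            best_name, best_dist = name, dist
    return best_name if best_dist <= threshold else None
```

A matched name would then index into a table of per-user fan preferences (speed, distance, UV intensity), as the description suggests.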
The Robot can detect and track human bodies in its camera image feed [such as that shown in Figure 12c]. This can be used to orient the robot to face the nearest user and then, preferably, blow clean air directly in his or her direction. Further, the Robot can also track gestures, such as waving, shoving motions, thumbs up, thumbs down, and a "come-here" summoning motion. These motions can be used as inputs to control the operation of the fan. For example, "come here" indicates to the Robot that the user wants the Robot to move closer and blow clean air more directly onto the user. A thumbs up can indicate that the user wants the Robot to turn up the fan speed, while a thumbs down can mean turn down the fan speed. A shoving motion can indicate to the Robot to turn away from the user or to back up. Images from the Robot's cameras are analyzed by a pose-recognition algorithm that recognizes human figures and returns coordinates (X,Y), or possibly even (X,Y,Z), representing the locations of various key points (such as, but not limited to, joint locations, center of mass, etc.) [see Figure 10].
One or more of these key points, such as the center of mass, is utilized by an algorithm that controls the robot's motors, orienting the robot until the key point's coordinates indicate that the human figure is in the center of the Robot's frame and, therefore, that the Robot is facing towards the human figure. For example, suppose the bottom left corner of the Robot camera frame is at the (X,Y) coordinate (0,0), the top right is at (200,200), and the center of the camera frame is at (100,100). Then, if a person's center of mass is reported at point (150,100), this indicates that the person is to the right of the Robot. The Robot will then send a command for its motors to turn clockwise by activating its left wheel. In consequence, the Robot will turn until the center of mass reaches a location that is satisfactorily close to the center of the image frame, like (105,100). The key points are also used to recognize gesture inputs from users. Body gestures can be defined for the Robot according to parameters such as distances and angles between joints. The Robot uses artificial intelligence algorithms, such as an artificial neural network, to determine whether the relationships between the key points it sees on screen match any of the defined gestures. When such a gesture is recognized, the Robot responds accordingly, as detailed above.
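The centering logic in this worked example can be sketched as a simple bang-bang controller on the horizontal coordinate. The frame size matches the example above; the tolerance band and command names are assumptions for illustration:

```python
FRAME_WIDTH = 200            # camera frame spans (0,0) to (200,200), per the example
CENTER_X = FRAME_WIDTH / 2   # horizontal center of the frame, x = 100
TOLERANCE = 10               # assumed "satisfactorily close" band around center

def turn_command(center_of_mass_x: float) -> str:
    """Decide which way to turn so the tracked person ends up frame-centered."""
    if center_of_mass_x > CENTER_X + TOLERANCE:
        return "turn_clockwise"        # person is to the robot's right
    if center_of_mass_x < CENTER_X - TOLERANCE:
        return "turn_counterclockwise" # person is to the robot's left
    return "stop"                      # close enough to center, e.g. x = 105
```

With these values, a center of mass at x = 150 yields a clockwise turn, and the robot stops once the x coordinate falls within the band around 100, matching the (105,100) endpoint in the text.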
The Robot can use audio data to turn towards any user that is speaking [such as with the microphone array shown in Figures 12a and 12b]. This is achieved by using a circular array of microphones, each reporting its location on the array. They pick up audio information from the environment and report it, together with the location data, to a computer processing unit. A filter may or may not be applied to eliminate background noise. The processing unit determines which of the microphones in the array is picking up the loudest sound or picked up the sound first (i.e., is closest to the origin of the sound), which indicates that the origin of the sound is in the direction of that microphone. The Robot can then activate its motors to turn towards the origin of the sound until the microphone picking up the loudest audio is the one in the center of the array, indicating that the Robot is facing the origin of the sound.
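The loudest-microphone heuristic described above can be sketched as follows. The angle layout and loudness values are hypothetical, and real direction-of-arrival estimation would typically use inter-microphone time differences rather than amplitude alone:

```python
def loudest_direction(readings: dict) -> float:
    """Angle (degrees) of the microphone currently reporting the loudest sound.
    `readings` maps each microphone's angle on the circular array to its loudness."""
    return max(readings, key=readings.get)

def turn_angle(readings: dict, facing: float = 0.0) -> float:
    """Signed rotation (degrees, in [-180, 180)) needed to face the sound source."""
    return (loudest_direction(readings) - facing + 180) % 360 - 180
```

For example, a robot facing 0° whose loudest microphone sits at 350° should turn 10° counterclockwise rather than 350° clockwise, which the modular arithmetic handles.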
An intelligent robot can use navigation, AI, audio recognition, image recognition, etc. to operate. According to the herein disclosed invention, the intelligent robot comprises an air filter and can further operate the air filter using the navigation, AI, audio recognition, image recognition, etc., taking advantage of those systems to operate the air filter at greater efficiency. Greater efficiency may mean, for example:
1. Using online weather reports or temperature or air sensors to control the operation state of the air filter.
2. Altering the operation of the air filter based on whether there are people present.
3. Altering the operation of the air filter based on voice command, remote control, mobile app, gesture control, or a set schedule of a user, wherein the schedule of a user can be known through manual input or AI learning. For example, audio commands ("Turn up the fan", "Turn the fan off", "Turn down the fan", etc.) as well as commands from a mobile application (a sliding bar for fan speed, an on/off switch, etc.). All these parameters can be customized by the user for each session or can be left at a default setting.
4. Using face recognition or voice matching (voice matching being the use of voice recognition to identify an individual by the sound of their voice) to remember people or match them against a registry of known users, which can have different access levels to the control of the intelligent robot or different priority levels, and acting accordingly. An example of a high-priority user is a respiratory patient in a hospital setting.
5. If multiple people are present, the robot can intelligently operate the air filter to focus on a known user or prioritize users as described in item 4, focus on the closest person if the people are significantly distanced from one another, flip the focus of the air filter between the users at an interval or periodically, or average the placement of the users and focus at the center if the users are close enough together.
6. If the air filter further comprises electromechanical equipment, for example a fan and vents, then the intelligent robot can operate the electromechanical equipment based on items 1-5, for example operating fan speed accordingly or angling vents accordingly.
7. The intelligent robot may further be able to operate the air filter based on the battery level or charging status, for example high-power operation, normal operation, and power-saving operation, as well as the basic ON state and OFF state.
Within the robot is a case that encloses an air filter using at least one known air filtering technique, such as an ion filter, screen filter, HEPA filter, UV filter, etc. A UV filter must be encased in a UV-blocking material like metal or UV-blocking plastic. The air flows through the encasing, likely with the help of a fan forcing air flow through the filter. The robot has an input vent to intake air placed somewhere on the robot, for example near the filter, directed toward the floor, directed toward the back of the robot, or at an angle between the floor and the back of the robot. The placement is not significant, but it can be beneficial to have the opening near the filter if possible, and it can be beneficial for it to be directed away from the exhaust vent where the clean air flows out, so as not to "steal" the filtered air that is meant for the user.
The placement of the exhaust vent will be directed toward the front of the robot so that, when facing a person, the filtered air will be directed toward said person. If the robot has a further improvement and can not only tilt its head but also pan its head from side to side or even fully rotate the head, then the air vent will not simply be in the front, but at a set point or angle that is known to the robot prior to operation. For a simple example, if the front of the robot is at an angle theta and the exhaust vent is facing toward the front of the robot, then the vent can be viewed as being at 0°. If the robot knows its head is facing at an angle away from the front of the robot, then it knows the position of what it sees and where its head is positioned in relation to the body, and it can face the exhaust vent accordingly. For example, if the robot's head is panned 35° away from the front of the robot and it sees a user directly in front of the robot's head, then the robot will need to rotate its body 35° in order to face the user. Other known coordinate systems and calculations may be used to achieve similar effects.
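The pan-compensation arithmetic in the example above can be expressed compactly. The function below is a sketch under the stated 0°-front convention; the normalization to the shorter turn direction is an added assumption.

```python
# Sketch of head-pan compensation: the body rotation needed so the
# exhaust vent (fixed at the body's front, 0 deg) faces the target
# seen by the head camera.

def body_rotation(head_pan_deg, target_offset_in_head_view_deg=0.0):
    """Angle the body must turn to face the target: the head-pan angle
    plus the target's angular offset within the head's view."""
    rotation = head_pan_deg + target_offset_in_head_view_deg
    # Normalize to [-180, 180] so the robot takes the shorter turn.
    return (rotation + 180.0) % 360.0 - 180.0

# Head panned 35 deg, user directly in front of the head: body turns 35 deg.
print(body_rotation(35.0, 0.0))
```

The same calculation generalizes to fully rotating heads, since the normalization keeps the result within a half-turn in either direction.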
At the end of the exhaust vent there can be a simple output opening or a manually controlled or electrically controlled or both manually and electrically controlled directional air flow controller controlling the air flow vertically, horizontally, or both vertically and horizontally.
The air filter can be placed to be easily removable from the robot for cleaning and repair purposes.
Alternatively, the air filter can be an individual module separate from the main body of the robot and attaching to the robot at a set point or a number of set points. The set point or number of set points may comprise an electrical connection for power, data, or power and data to connect the air filter module to the robot. If the air filter is a module, it may comprise all the components on its own or have some components on the main body of the robot being partially integrated onto the robot.
A modularly constructed robot may include an air filter module. This module may completely self-contain the air filter from air flow input to air flow output, regardless of the other modules making up the modular robot. It may also be possible to have the air filter partially integrated into another module. For example, there can be a torso module in the robot, comprising a vent system used by the air filter, with a connection point for an air filter module comprising the air filter's core components (such as the casing, the filter(s), the vents, and pipes). The air filter module can be attached to or removed from the torso module in a cartridge-like system on the modular robot or on the torso itself. The robot can thus function with or without the air filter module attached.
According to the broadest concept of the invention, the core component of the filter may be defined as the filter module itself. Other components, like the casing, the filters, the fan, the pipes, and the vents, may not be core components per se, but rather preferable or desirable components.
The "cartridge-like system" can be thought of as analogous to a radio cassette player.
* The radio cassette player is analogous to the intelligent robot.
* The radio is analogous to the other, non-air-filter features the robot provides.
* The cassette player is analogous to the connection components for the air filter:
o For example, where the air filter is partly integrated into the robot, like having vents and pipes built in.
o For example, a module connection point to receive and operate the air filter module.
* A cassette is analogous to the air filter or the air filter's core components.
* A cassette being able to play different music is further analogous to the air filter being able to be attached to the robot while comprising different internal components, like a screen filter and/or HEPA filter and/or ion filter and/or UV filter, etc. Each component individually can have variations in quality and grade; thus there can be a plethora of different filters, just as there is a plethora of cassettes with different music and combinations of songs.
* A radio cassette player with a cassette inside is analogous to an intelligent robot comprising a modular air filter that is able to provide the additional air filter feature.
The radio cassette player can operate the radio just fine with or without a cassette inside, while still having all the functionality to play a cassette. But, when a cassette is in the radio cassette player, it can in addition play the cassette. A further analogy is the fact that different cassettes can play different music.
Referring now to Figure 11, when the robot is activated and the system turns on, the robot will check the air filter default settings or customized default settings 101. The default settings may be congruent with the robot air filter operation logic. These settings may include, for example, the air filter being on at all times, a set battery level or charge condition to turn on or off, a schedule of when to be on or off, etc. After the default settings are checked, the robot will make further checks as to whether the current status of the robot warrants a response 102. This can include, for example and without limitation, internal statuses and external statuses. Examples of current status checks may be the following, or a change in status of the following: current battery status, charging status, weather status, person present in the robot's vicinity, multiple persons present in the robot's vicinity, command given to the robot, a set amount of time elapsed, a signal from the robot's AI, a scheduled event occurring, default settings having changed, default setting conditions being triggered, image recognition status, voice recognition status, robot-person interaction status, the robot's current activities and current priority list of activities status, etc. Any other relevant status can also be monitored.
The current status default checks 102 may be further broken down into simple current status checks 102a and complex status checks 102b, and may comprise even further levels, dividing the checks by computational cost such as time, CPU usage, memory, etc. This may be done to allow the robot to perform simple checks first, before the complex checks. For example, if the robot's battery is under a set level, for example 2%, then the robot may be set to operate the air filter in the off mode and no longer require more complex checks to determine the status of the air filter.
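The tiered check order (102a before 102b) can be sketched as a short-circuit: cheap checks run first and can decide the state without ever invoking the expensive ones. The threshold value and function names below are assumptions for illustration.

```python
# Sketch of tiered status checks: a simple battery check (102a) runs
# first and can short-circuit the complex checks (102b).

LOW_BATTERY_CUTOFF = 2  # percent; below this the filter is forced off

def decide_filter_state(battery_pct, run_complex_checks):
    # Simple check 102a: battery critically low -> off, skip the rest.
    if battery_pct < LOW_BATTERY_CUTOFF:
        return "off"
    # Complex checks 102b (image recognition, schedules, AI signals, ...)
    # only run when the simple checks did not already decide the state.
    return run_complex_checks()

print(decide_filter_state(1, lambda: "on"))   # short-circuits to "off"
print(decide_filter_state(80, lambda: "on"))  # falls through to 102b
```

Further tiers (e.g. ordering the complex checks themselves by CPU usage or memory) would follow the same pattern.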
The logic to determine whether the results of the checks warrant a response is determined by the default settings or the customized default settings.
There are 3 archetypal responses 103 that the robot may choose: do nothing 103a (do not make any changes), change mode 103b, or adjust within the current mode 103c. If the robot's response 103 is do nothing 103a, the air filter will remain in the current state and current mode of operation. When the robot's response 103 is change mode 103b, the robot will change its mode of air filter operation. For example, when changing mode 103b, switching from "Midpoint Mode" to "Priority User Mode" will cause the Robot to switch from directing the air filter towards the midpoint of detected users to focusing exclusively on a high-priority user. If the robot's response 103 is adjust within the current mode 103c, the robot will adjust the operation of the air filter within its current mode: it will follow the specific behavior protocol dictated by that mode. For example, if, while the robot is in "Midpoint Mode," a check detects that a user has moved, the robot will adjust (without changing mode) by calculating a new midpoint between the users it sees and moving to face towards the new midpoint.
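The three archetypal responses can be sketched as a small dispatcher. The state shape, mode names, and midpoint adjustment below are assumptions drawn from the examples in the text, not a definitive implementation.

```python
# Sketch of responses 103a-103c: do nothing, change mode, or adjust
# within the current mode (here, recomputing the midpoint target).

def respond(state, response, new_mode=None, users=None):
    if response == "do_nothing":        # 103a: keep state and mode
        return state
    if response == "change_mode":       # 103b: e.g. Midpoint -> Priority User
        return {**state, "mode": new_mode}
    if response == "adjust":            # 103c: stay in mode, follow its protocol
        if state["mode"] == "midpoint" and users:
            xs = [u[0] for u in users]
            ys = [u[1] for u in users]
            target = (sum(xs) / len(xs), sum(ys) / len(ys))
            return {**state, "target": target}  # face the new midpoint
        return state
    raise ValueError(f"unknown response: {response}")

s = {"mode": "midpoint", "target": None}
print(respond(s, "adjust", users=[(0, 0), (4, 2)]))  # new midpoint target
```

Each mode's "adjust" branch would implement that mode's own behavior protocol; only the midpoint case is shown here.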
The operation mode of the air filter comprises two general modes, Off mode 103ba and On mode 103bb. Off mode 103ba comprises the mode where the air filter is in the off state, and On mode 103bb comprises the mode where the air filter is in the on state. Within On mode 103bb, there are several sub-modes that can further delineate the behavior pattern of the robot. For example, these may include operation modes like passive mode 103bba, single user mode 103bbb, and multi-user mode 103bbc. In passive mode 103bba, the air filter is in an on state and passively active, but not directly targeting a user. In single user mode 103bbb, the robot may notice a single user and target the air filter at that single user. In multi-user mode 103bbc, the robot may notice multiple users and target the air filter toward at least one of the users. Each of these modes, passive mode 103bba, single user mode 103bbb, and multi-user mode 103bbc, may in part have further sub-modes. For example, the multi-user mode 103bbc may operate in a number of sub-modes which may comprise, but are not limited to, a midpoint mode, a closest person mode, a priority user mode, an interval switching mode, an oscillatory mode, a passive mode, and a largest coverage mode. For example, midpoint mode denotes that, when the robot detects multiple people, it should face a midpoint between the users. Closest person mode denotes that, when the robot detects multiple people, it should only turn to face towards the closest person. Priority user mode denotes that, when the robot detects multiple people, it should ignore all other people and only face towards a specific person deemed a priority user. Interval switching mode denotes that, when the robot detects multiple people, it should face each user for a set or variable time interval and switch between them.
Oscillatory mode denotes that, when the robot detects multiple people, it should move the air filter in an oscillatory manner, spanning the coverage of the air filter across the users. Passive mode denotes that, when the robot detects multiple people, it should passively remain in the on mode in the vicinity of the users. Largest coverage mode denotes that, when the robot detects multiple people, it should face in the general direction of the largest number of users so as to cover the largest number of users. Some of these sub-modes' principles may be altered in some way and may be adapted to operate as subsets of the single user mode 103bbb where applicable.
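A few of the multi-user sub-modes described above reduce to simple target-selection rules, sketched below. The user record format, distance metric, and mode names are assumptions for illustration only.

```python
# Sketch of target selection for three multi-user sub-modes:
# midpoint, closest person, and priority user.
import math

def select_target(mode, users, robot_pos=(0.0, 0.0), priority_name=None):
    """users: list of dicts {'name': str, 'pos': (x, y)}. Returns the
    (x, y) point the air filter should face, or None."""
    if not users:
        return None
    if mode == "midpoint":
        xs = [u["pos"][0] for u in users]
        ys = [u["pos"][1] for u in users]
        return (sum(xs) / len(xs), sum(ys) / len(ys))
    if mode == "closest":
        return min(users, key=lambda u: math.dist(robot_pos, u["pos"]))["pos"]
    if mode == "priority":
        for u in users:
            if u["name"] == priority_name:
                return u["pos"]
        return None  # priority user not present
    raise ValueError(f"unknown mode: {mode}")

people = [{"name": "a", "pos": (2.0, 0.0)}, {"name": "b", "pos": (6.0, 4.0)}]
print(select_target("midpoint", people))
print(select_target("closest", people))
print(select_target("priority", people, priority_name="b"))
```

Interval switching, oscillatory, and largest coverage modes would be time- or region-based rather than single-point selections, and so are omitted from this sketch.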
It may be appreciated that the above are just illustrative examples of possible operating modes. Any other suitable mode may be created and utilized.
The protocols for whether or when to switch between such modes are all included in the initial settings and can be pre-programmed by the development team, customized for specific user requirements, operated through artificial intelligence, or a combination of any of these. For example, the robot may ship without any default setting to switch into "Priority User Mode". However, a particular doctor working in the robot's facility may decide that he or she wants the robot to recognize him or her as a priority user. The doctor can modify the robot's settings so that, whenever the results of the robot's status checks show that this doctor is detected, the robot will switch out of its current mode, enter "Priority User Mode", and turn to face the doctor. Thereafter, the robot will stay in this mode until checking results indicate, according to the robot's default settings, that the robot should switch to a different mode. As there are many sub-modes in 103, the logic for when and how to switch between modes can be quite complex, but it will all be performed according to the robot's settings and check results. An example of a combination of pre-programming by the development team and user customization is if the air filter is pre-programmed by the development team to change to the off mode if the battery is below a set percentage within a range, say 0-25% battery, while the user-customizable portion may be within the range of 5-25%, allowing the user to set the threshold as low as 5% or as high as 25%.
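The developer-bounded, user-customizable cutoff in the example above amounts to clamping the user's choice into a fixed range. The range values and function names below are taken from the example and are illustrative only.

```python
# Sketch of the battery cutoff example: the development team fixes the
# allowed range (5-25%) and the user picks any value inside it.

DEV_MIN, DEV_MAX = 5, 25  # percent, fixed by the development team

def set_user_cutoff(requested_pct):
    """Clamp the user's requested power-off threshold into the allowed range."""
    return max(DEV_MIN, min(DEV_MAX, requested_pct))

def filter_mode(battery_pct, user_cutoff):
    return "off" if battery_pct < user_cutoff else "on"

cutoff = set_user_cutoff(3)      # too low: clamped up to 5
print(cutoff)
print(filter_mode(4, cutoff))    # battery below cutoff -> off
print(filter_mode(50, cutoff))   # battery above cutoff -> on
```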
Figure 13 shows the logical flow for how the robot uses the image processing results to generate a motor response in which the robot physically turns to face a desired target. Initially, the intelligent robot scans for an image 401, receives image data, and may then format the image data to prepare for image processing 402. The robot checks for any known objects or persons (steps 404 and 405) that it can recognize in the image data 403. If an object is not detected, the intelligent robot may, for example, make no change in status or switch to the OFF mode. When the robot does detect an object, the robot will determine whether that object is an object of target value, for example a person, pet, etc. The robot may calculate (step 406) the position of the object of target value and set that value as a marker. This marker may be a target or be used to calculate a target, set of targets, or area of targets. Thereafter, the intelligent robot calculates the required motor movement (step 407) and then sends a command, based on the calculated required motor movement, to move toward, angle toward, or both move and angle toward a target, set of targets, or area of targets. At this point (step 408), the motor executes the command and moves the intelligent robot to face toward the target, set of targets, or area of targets.
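The flow of Figure 13 can be sketched end to end as below. The detector output format, the set of "target value" classes, and the motor command shape are all hypothetical stand-ins, not the actual interfaces of the disclosed robot.

```python
# Illustrative sketch of the Figure 13 flow (steps 401-408): scan,
# detect, pick an object of target value, compute and execute a motor
# command to face it.

TARGET_CLASSES = {"person", "pet"}  # hypothetical "objects of target value"

def scan_and_face(capture_image, detect_objects, send_motor_command):
    frame = capture_image()                       # 401: scan for an image
    detections = detect_objects(frame)            # 402-403: format and recognize
    targets = [d for d in detections
               if d["label"] in TARGET_CLASSES]   # 404-405: object of target value?
    if not targets:
        return "no_change"                        # e.g. no status change / OFF mode
    marker = targets[0]["position"]               # 406: position set as marker
    # 407: calculate the required motor movement toward the marker.
    command = {"action": "turn_and_face", "target": marker}
    send_motor_command(command)                   # 408: motor executes the command
    return "facing_target"

# Hypothetical stubs standing in for the camera, detector, and motors:
result = scan_and_face(
    capture_image=lambda: "frame",
    detect_objects=lambda f: [{"label": "person", "position": (120, 80)}],
    send_motor_command=lambda cmd: None,
)
print(result)
```

In practice the detector would be an image recognition model, and step 407 would produce wheel or head-pan commands like those sketched earlier.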
It will be evident to those skilled in the art that the invention is not limited to the details of the foregoing illustrative embodiments and that the present invention may be embodied in other specific forms without departing from the spirit or essential attributes thereof. The present embodiments are therefore to be considered in all respects as illustrative and not restrictive, the scope of the invention being indicated by the appended claims rather than by the foregoing description, and all changes which come within the meaning and range of equivalency of the claims are therefore intended to be embraced therein.

Claims (10)

We Claim:
1. An intelligent, multi-function robot comprising: a housing built into said robot and having a UV-blocking material; an air filter positioned within said housing; and a fan directing air flow through the air filter, an intake of said airflow and an outflow of said airflow through said air filter being placed to direct clean air toward a targeted direction.
2. An intelligent, multi-function robot according to Claim 1, wherein said air filter is a UV air cleaner, HEPA filter, ionization air cleaner, or screen filter.
3. An intelligent, general function robot according to Claim 1, wherein said robot is capable of autonomous movement.
4. An intelligent, general function robot according to Claim 1, wherein said robot has means for recognizing a targeted person or direction and means for orienting said robot in relation to said targeted person or direction.
5. An intelligent, general function robot according to Claim 1, wherein said robot further comprises multiple fans, tubes, and vents, wherein said air flow is blown by said multiple fans via said tubes through a body of the robot and out through vents towards said targeted direction.
6. An intelligent, multi-function robot according to Claim 1, wherein said air filter is removably positioned within said robot.
7. An intelligent, general function robot according to Claim 1, wherein said robot further comprises means for identifying said targeted direction.
8. An intelligent, general function robot according to Claim 7, wherein said means for identifying said targeted direction comprises: a memory module for said robot containing information about potential targets and identifying characteristics of said potential targets; scanners receiving external information about targets in a vicinity of said robot; and a processing unit analyzing said external information to identify corresponding identifying characteristics of said potential targets and to then direct clean air toward said target.
9. An intelligent, general function robot according to Claim 1, wherein said robot further comprises means for detecting or tracking targets and means for orienting said robot to face a target.
10. An intelligent, general function robot according to Claim 9, wherein said means for detecting or tracking comprises a camera image feed and said means for orienting comprises inputs to control the operation of said fan.
GB2206738.3A 2022-02-24 2022-05-09 Robot air filter Active GB2604781B (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US202263313317P 2022-02-24 2022-02-24
US202263327836P 2022-04-06 2022-04-06

Publications (2)

Publication Number Publication Date
GB2604781A true GB2604781A (en) 2022-09-14
GB2604781B GB2604781B (en) 2023-04-19

Family

ID=82898732

Family Applications (1)

Application Number Title Priority Date Filing Date
GB2206738.3A Active GB2604781B (en) 2022-02-24 2022-05-09 Robot air filter

Country Status (1)

Country Link
GB (1) GB2604781B (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP4345391A1 (en) * 2022-09-30 2024-04-03 HDHyundai Robotics Co., Ltd. Air cleaning and disinfection robot and method of operating the same

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106403106A (en) * 2016-09-12 2017-02-15 成都创慧科达科技有限公司 Multifunctional domestic robot
CN206080397U (en) * 2016-06-14 2017-04-12 诺曼利尔(青岛)环境能源技术有限公司 Air purification disappears and kills robot
KR20170052743A (en) * 2015-11-03 2017-05-15 에이블비 주식회사 An air cleaner
CN214074367U (en) * 2020-12-23 2021-08-31 斯坦德机器人(深圳)有限公司 Full-automatic disinfection robot
WO2021194023A1 (en) * 2020-03-26 2021-09-30 주식회사 제타뱅크 Mobile robot and control method therefor
CN114043489A (en) * 2021-09-27 2022-02-15 江苏格姆思体育科技研究院有限公司 Robot with washing air purification function
CN216159257U (en) * 2021-08-26 2022-04-01 杨晖 Integrated cavity structure disinfection robot

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10704799B2 (en) * 2016-05-23 2020-07-07 Hanon Systems Moveable air conditioner
KR102034629B1 (en) * 2018-01-11 2019-11-08 엘지전자 주식회사 Mobile indoor unit and air conditioning system including the same


Also Published As

Publication number Publication date
GB2604781B (en) 2023-04-19

Similar Documents

Publication Publication Date Title
US11571648B2 (en) Air cleaner
US11957807B2 (en) Cleaning robot
CN1198244C (en) Control method
WO2013121653A1 (en) Free-running electronic device
CN102892438B (en) Germicidal bulb disinfection base for ophthalmic lens
GB2604781A (en) Robot air filter
KR20190098101A (en) Air conditioner
US11779676B2 (en) Sanitizing device
CN102892436A (en) Light emitting diode disinfection base for ophthalmic lenses
US11684691B2 (en) Personal sanitizing device
US11679176B2 (en) Method and system for LED based virus and bacteria removal
US11872512B2 (en) Robot air filter
KR102001782B1 (en) Method of controlling air-cleaner using artificial intelligence based on input sound and air-cleaner implementing thereof
US20210290814A1 (en) Methods and systems for sterilizing spaces or surfaces from stand-off distances
CN112466444B (en) Comprehensive management system for protective articles
US20220031881A1 (en) Ultraviolet disinfector and related methods for disinfecting articles
Bergasa et al. Guidance of a wheelchair for handicapped people by face tracking
Bhati et al. CAPture: a vision assistive cap for people with visual impairment
JP7118456B2 (en) Neck device
CN112291319A (en) Method for simulating thinking and intelligent equipment
Ghazal et al. Localized assistive scene understanding using deep learning and the IoT
KR102488483B1 (en) Disinfection robot interacting with humans based on safety distance
US20240115754A1 (en) Ultra light biological satellite mask removable from and/or mateable to mechanical, chemical, and/or nuclear host mask
JP7008356B2 (en) Mobile terminal device
Yuliyanto et al. Face And Mouth Openness Detection on Visual Servoing Robot Using Haar-cascade and Adaptive Boosting