CN114190823B - Intelligent household robot and control method

Info

Publication number
CN114190823B
CN114190823B CN202111231155.2A
Authority
CN
China
Prior art keywords
user
module
robot
main control
control module
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202111231155.2A
Other languages
Chinese (zh)
Other versions
CN114190823A (en)
Inventor
蒋少华 (Jiang Shaohua)
吴早阳 (Wu Zaoyang)
贾凤娇 (Jia Fengjiao)
梁易高 (Liang Yigao)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hunan Normal University
Original Assignee
Hunan Normal University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hunan Normal University
Priority to CN202111231155.2A
Publication of CN114190823A
Application granted
Publication of CN114190823B
Active legal status
Anticipated expiration

Classifications

    • A HUMAN NECESSITIES
    • A47 FURNITURE; DOMESTIC ARTICLES OR APPLIANCES; COFFEE MILLS; SPICE MILLS; SUCTION CLEANERS IN GENERAL
    • A47L DOMESTIC WASHING OR CLEANING; SUCTION CLEANERS IN GENERAL
    • A47L11/00 Machines for cleaning floors, carpets, furniture, walls, or wall coverings
    • A47L11/24 Floor-sweeping machines, motor-driven
    • A47L11/40 Parts or details of machines not provided for in groups A47L11/02 - A47L11/38, or not restricted to one of these groups, e.g. handles, arrangements of switches, skirts, buffers, levers
    • A47L11/4002 Installations of electric equipment
    • A47L11/4011 Regulation of the cleaning machine by electric means; Control systems and remote control systems therefor
    • A47L11/4061 Steering means; Means for avoiding obstacles; Details related to the place where the driver is accommodated
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02P CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P90/00 Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P90/02 Total factory control, e.g. smart factories, flexible manufacturing systems [FMS] or integrated manufacturing systems [IMS]

Landscapes

  • Manipulator (AREA)

Abstract

The invention relates to the technical field of robots and discloses an intelligent household robot and a control method. The robot comprises a first sensor module, a main control module, and an actuator, the main control module being connected to the first sensor module and to the actuator. The first sensor module detects the concentration of gas components in the environment and sends the readings to the main control module. The main control module analyzes the gas concentrations and, when a concentration exceeds a set threshold, sends alarm information over the Internet, via the communication module, to the APP of a mobile terminal. Under the control of the mobile-terminal APP or of the main control module, the actuator travels to the device producing the gas in the environment and performs a closing operation so that the device stops producing the gas. The intelligent household robot provided by the invention offers multiple functions and is convenient to control remotely.

Description

Intelligent household robot and control method
Technical Field
The invention relates to the technical field of robots, in particular to an intelligent household robot and a control method.
Background
In recent years, with the rapid development of sensor technology, image processing, voice processing, automatic control, wireless networking, and artificial intelligence, intelligent robots have gradually entered people's lives. Robotics is becoming one of the fastest-growing industries and has a broad market prospect.
Existing household robots fall short in several ways. One type is single-purpose, lacks generality, and has weak human-machine interaction capability, being closer to a machine than a robot; the sweeping robot is an example. Another type has a humanoid form and relatively good human-machine interaction, such as the French robot NAO or the Japanese robot Pepper, but its decision-making capability is weak and its functions remain limited; despite the high degree of human-machine communication, it is not well suited to household use. Some greeting robots likewise have a humanoid appearance, can recite certain information, and can interact with children to a degree, but they still lack functions of practical household value and cannot meet the needs of daily home life.
Disclosure of Invention
The invention provides an intelligent household robot and a control method thereof, which are used for solving the problems in the prior art.
In order to achieve the above object, the present invention is realized by the following technical solution:
in a first aspect, an embodiment of the present application provides an intelligent home robot, including a first sensor module, a main control module, and an actuator, the main control module being connected to the first sensor module and to the actuator;
the first sensor module is used for detecting the content of gas components in the environment and sending the content of the components to the main control module;
the main control module is used for analyzing the content of the gas component, and when the content of the gas component exceeds a set threshold value, the main control module is connected with the Internet through the communication module to send alarm information to the APP of the mobile terminal;
the executing mechanism is used for running to a device generating the gas component in the environment under the control of the APP of the mobile terminal or the control of the main control module, and executing closing operation to control the device to stop generating the gas component.
Optionally, the actuator comprises a driving wheel and a mechanical arm;
the driving wheel is used for driving the robot to travel to a device generating the gas component in the environment;
and the mechanical arm is used for executing closing operation under the control of the APP of the mobile terminal or the control of the main control module so as to control the device to stop generating the gas component.
Optionally, the system also comprises an audio acquisition module and a voice recognition module connected with the audio acquisition module,
the audio acquisition module is used for acquiring audio data of the environment where the robot is located;
the voice recognition module is used for analyzing and recognizing the audio data.
Optionally, the system further comprises a visible light and infrared video acquisition module, a face recognition module, a gesture recognition module and a communication module, wherein the visible light and infrared video acquisition module is respectively connected with the face recognition module and the gesture recognition module;
the visible light and infrared video acquisition module is used for acquiring video information of the environment where the robot is located;
the face recognition module is used for analyzing the collected video information and recognizing the face information in the video information;
the gesture recognition module is used for analyzing the collected video information and recognizing the posture information of people in it, such as body actions including falling, sitting, lying, standing, and walking, as well as hand gestures;
the main control module is also used for outputting corresponding operation instructions according to the face information and the gesture information; the corresponding operation instructions comprise following, accompanying, supporting, closing and opening the mechanical parts, switching on and off, controlling the executing mechanism to make a call and triggering an alarm.
Optionally, the system also comprises a second sensor module, a remote control module, a biological characteristic sensor, a built-in database, a display module, a loudspeaker and a positioning module which are connected with the main control module,
the second sensor module is used for detecting environmental temperature and humidity information in the house according to user setting or remote instructions;
the remote control module is used for sending a remote control command to the intelligent household appliance according to a control instruction of the mobile terminal APP or a decision result of the main control module;
the built-in database is used for storing user-set parameters, index data, state data, and other data required for system operation;
the display module is used for completing the display and output of related information under the control of the main control module;
the loudspeaker is used for outputting voice information under the control of the main control module;
the positioning module is used for acquiring the position information of the robot under the control of the main control module.
Optionally, the system also comprises a photosensitive sensor and a distance sensor which are connected with the main control module;
the photosensitive sensor is used for detecting indoor brightness;
the distance sensor is used for measuring the distance between the robot's mechanical arm and the object the arm operates on, and for measuring the distance between the robot and the accompanied user when the robot is following and accompanying.
Optionally, the system also comprises a laser radar, a gyroscope and an accelerometer which are connected with the main control module,
the laser radar is used for acquiring environment information of the robot, wherein the environment information comprises wall information, door and window information and barrier information in the advancing process;
the gyroscope is used for measuring the direction of the robot in the running process and sensing in the horizontal and vertical directions;
the accelerometer is used for measuring the inclination angle of the equipment relative to the horizontal plane and acquiring the acceleration values of the robot in the directions of x, y and z respectively.
Optionally, the remote control module comprises a bluetooth module, a radio frequency module and an infrared emission receiving module;
the Bluetooth module is used for realizing wireless Bluetooth communication, positioning and control under the control of the main control module;
the radio frequency module is used for carrying out radio frequency communication with part of devices in the environment;
the infrared transmitting and receiving module is used for communicating with the intelligent household appliance under the control of the main control module.
In a second aspect, an embodiment of the present application provides a control method of an intelligent home robot, which is applied to the intelligent home robot according to the first aspect, including the following steps:
the first sensor module detects the composition and content of the gas components in the environment and sends the composition and content to the main control module;
the main control module analyzes the composition and content of the gas components and, when the content of a specific gas component exceeds a set threshold, sends alarm information through the communication module and the Internet to the APP of the mobile terminal. The user can then operate the mobile APP, which sends instructions to the robot through the Internet and controls the actuator to execute specific actions, such as moving forward, moving backward, steering, moving the mechanical arm, and grasping;
and the executing mechanism runs to a device generating the gas component in the environment under the control of the APP of the mobile terminal or the decision control of the main control module, and executes closing operation to control the device to stop generating the gas component.
Optionally, the intelligent home robot further comprises a built-in database, and the method further comprises:
acquiring a user physical state index through a biosensor, or reading a user physical state index input by the user, and storing the user physical state index in the built-in database;
the system takes the physical state index as the input of an algorithm, computes a personalized recipe, and pushes the personalized recipe to the user.
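The recipe-push step above can be sketched as follows. This is an illustrative assumption, not the patent's algorithm: body-state indices are mapped to dietary labels by simple threshold rules, and the labels then filter a recipe table. All names, thresholds, and recipe data below are hypothetical placeholders.

```python
# Hypothetical recipe recommendation: indices -> diet labels -> matching recipes.
RECIPES = [
    {"name": "steamed fish", "labels": {"low_sugar", "low_fat", "high_protein"}},
    {"name": "fried pork", "labels": {"high_fat"}},
    {"name": "vegetable congee", "labels": {"low_sugar", "low_sodium"}},
]

def dietary_labels(indices):
    """Map physical state indices to recommended diet labels (illustrative rules)."""
    labels = set()
    if indices.get("blood_sugar_mmol", 0) > 7.0:   # elevated blood sugar
        labels.add("low_sugar")
    if indices.get("systolic_mmHg", 0) > 140:      # elevated blood pressure
        labels.add("low_sodium")
    return labels

def recommend(indices):
    """Return recipes whose labels cover every required diet label."""
    wanted = dietary_labels(indices)
    return [r["name"] for r in RECIPES if wanted <= r["labels"]]

print(recommend({"blood_sugar_mmol": 8.2, "systolic_mmHg": 150}))
```

In a full system the rule table and recipe data would come from the built-in database rather than being hard-coded.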
The beneficial effects are that:
the invention provides an intelligent household robot comprising a first sensor module, a main control module, and an actuator, the main control module being connected to the first sensor module and to the actuator. The first sensor module detects the concentration of gas components in the environment and sends the readings to the main control module; the main control module analyzes the gas concentrations and, when a concentration exceeds a set threshold, sends alarm information over the Internet, via the communication module, to the APP of the mobile terminal; under the control of the mobile-terminal APP or of the main control module, the actuator travels to the device producing the gas and performs a closing operation so that the device stops producing the gas. The intelligent household robot provided by the invention offers multiple functions and is convenient to control remotely: it realizes remote control, automatic alarms, accompanying, and other functions for both intelligent household appliances and traditional non-intelligent equipment, and introduces various wireless control modes, control loops with data feedback, a recipe pushing function, and the like.
Drawings
FIG. 1 is a schematic diagram of an intelligent home robot according to a preferred embodiment of the present invention;
FIG. 2 is a second schematic diagram of the intelligent home robot according to the preferred embodiment of the present invention;
FIG. 3 is a third schematic diagram of the intelligent home robot according to the preferred embodiment of the present invention;
FIG. 4 is a flow chart of an embodiment of an intelligent anti-theft control method for an intelligent home robot according to a preferred embodiment of the present invention;
FIG. 5 is a flowchart of an embodiment of a recipe recommendation control method for an intelligent home robot according to a preferred embodiment of the present invention;
fig. 6 is a flowchart illustrating an embodiment of a travel mode control method of an intelligent home robot according to a preferred embodiment of the present invention.
Detailed Description
The technical solutions of the present invention will be described clearly and completely below; it is apparent that the described embodiments are only some, not all, of the embodiments of the invention. All other embodiments obtained by those skilled in the art based on the embodiments of the invention without inventive effort fall within the scope of the invention.
Referring to fig. 1-3, an embodiment of the present application provides an intelligent home robot, which includes a first sensor module, a main control module, and an executing mechanism, wherein the main control module is connected with the first sensor module and the executing mechanism respectively;
the first sensor module is used for detecting the component content of the gas in the environment and sending the component content to the main control module;
the main control module is used for analyzing the content of the gas components, and sending alarm information to the APP of the mobile terminal through the communication module connected with the Internet under the condition that the content of the gas components exceeds a set threshold value;
and the executing mechanism is used for running to a device generating gas components in the environment under the control of the APP of the mobile terminal or the control of the main control module, and executing closing operation to control the device to stop generating the gas components.
In this embodiment, the main control module is electrically connected to the first sensor module, the voice module, and the actuator, and is configured to monitor a designated area according to a user setting or a user instruction. From the composition, concentration, temperature, and humidity parameters detected by the first sensor module for flammable or toxic gas components such as methane and carbon monoxide (CO), the main control module generates a decision and controls the actuator to close the stove switch.
As an alternative, the main control module analyzes the gas components monitored by the first sensor module; if the content of a specific component exceeds the threshold, an alarm is raised on the APP through the network. The user then uses the APP to control the actuator to drive the robot forward or steer it to the target position, and to control the movement and opening/closing of the mechanical arm to complete the specific closing operation.
As a further alternative, the robot may also patrol actively: it collects environmental data through vision or other sensors, and when the main control module's analysis finds an abnormality, it feeds an alarm back to the user's APP through the communication module and the Internet; depending on the degree of urgency, either the user responds or the robot takes action autonomously.
As a further alternative, the user may send an instruction through the APP; it is transmitted over the Internet, received by the robot's communication module, and parsed by the main control module, triggering the robot to travel to a specific place and collect surrounding data. The main control module analyzes the data and either autonomously decides on a specific action or sends the data over the Internet to the APP, where the user decides and issues a response instruction.
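The threshold comparison at the heart of the alarm flow described above can be sketched in a few lines. This is a minimal illustration, not the patent's implementation: the gas names and threshold values are assumptions, and the returned alarm strings stand in for the alarm information the communication module would forward to the APP.

```python
# Hypothetical sketch: compare each detected gas concentration (ppm) against
# a user-set threshold and collect alarm messages for any exceedance.
GAS_THRESHOLDS_PPM = {          # illustrative, user-configurable thresholds
    "methane": 5000,
    "carbon_monoxide": 50,
}

def check_gas_levels(readings_ppm):
    """Return a list of alarm messages for any component over its threshold."""
    alarms = []
    for gas, level in readings_ppm.items():
        limit = GAS_THRESHOLDS_PPM.get(gas)
        if limit is not None and level > limit:
            alarms.append(f"ALARM: {gas} at {level} ppm exceeds {limit} ppm")
    return alarms

# Example: a CO reading above the set threshold triggers one alarm.
print(check_gas_levels({"methane": 1200, "carbon_monoxide": 80}))
```

In the device, each alarm message would be handed to the communication module for delivery to the mobile-terminal APP.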
Optionally, the actuator comprises a drive wheel and a mechanical arm;
a driving wheel for driving the robot to travel to a device generating a gas component in the environment;
and the mechanical arm is used for executing closing operation under the control of the APP of the mobile terminal or the control of the main control module so as to control the device to stop generating the gas component.
It should be explained that the robot as a whole includes the sensor modules, the main control module, the actuator, and so on; when the driving wheel travels, the whole robot travels to the destination, and the mechanical arm naturally arrives with it. The driving wheel drives the robot to a specific target area; the mechanical arm performs specific arm movements and opening/closing operations to complete specific tasks, such as closing a kitchen stove switch, a tap, or other switches.
The control instruction of the main control module may come from an autonomous decision by an intelligent algorithm built into the main control module, or the user may send the instruction through the remote APP, in which case it is transmitted over the Internet, received by the robot's communication module, and issued by the main control module.
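The two instruction sources described above (autonomous decision and remote APP command) can converge on a single actuator interface, as in this sketch. The class and method names are illustrative assumptions, not taken from the patent.

```python
# Hypothetical actuator interface: drive wheel + mechanical arm, fed by one
# dispatcher regardless of whether the instruction came from the APP or from
# the main control module's own decision.
class Actuator:
    def __init__(self):
        self.log = []                       # record of issued commands
    def drive_to(self, target):
        self.log.append(("drive", target))  # drive wheel moves the whole robot
    def close_switch(self, device):
        self.log.append(("close", device))  # mechanical arm closing operation

def execute(instruction, actuator):
    """Dispatch one parsed (action, argument) instruction, whatever its source."""
    action, arg = instruction
    if action == "goto":
        actuator.drive_to(arg)
    elif action == "shutoff":
        actuator.drive_to(arg)       # travel first, then operate the arm
        actuator.close_switch(arg)

arm = Actuator()
execute(("shutoff", "stove_valve"), arm)   # e.g. an autonomous shutoff decision
print(arm.log)
```

The point of the shared dispatcher is that the APP path and the autonomous path differ only in where the instruction is parsed, not in how it is executed.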
Optionally, the system also comprises an audio acquisition module and a voice recognition module connected with the audio acquisition module,
the audio acquisition module is used for acquiring audio data of the environment where the robot is located;
and the voice recognition module is used for analyzing and recognizing the audio data.
In this optional embodiment, the voice recognition module is connected to the audio collection module, and is configured to analyze and recognize audio data collected by the audio collection module.
Optionally, the system further comprises a visible light and infrared video acquisition module, a face recognition module, a gesture recognition module and a communication module, wherein the visible light and infrared video acquisition module is respectively connected with the face recognition module and the gesture recognition module;
the visible light and infrared video acquisition module is used for acquiring video information of the environment where the robot is located;
the face recognition module is used for analyzing the collected video information and recognizing face information in the video information;
the gesture recognition module is used for analyzing the collected video information and recognizing the posture information of people in it, such as body actions including falling, sitting, lying, standing, and walking, as well as hand gestures;
the main control module is also used for outputting corresponding operation instructions according to the face information and the gesture information; the corresponding operation instructions comprise following, accompanying, supporting, closing, opening and setting mechanical parts, switching on and off, controlling an executing mechanism to make a call and triggering an alarm.
In this alternative embodiment, following means analyzing the movement track (travel route) of a particular followed object and taking the closest matching track (or travel route); accompanying means monitoring a particular accompanied object and interacting with (responding to) it by voice, video, or other actions. Closing may mean closing a faucet, a valve, a touch switch, a door, a pull switch, and the like; opening is the opposite, for opening a faucet, a valve, a touch switch, a gas-range switch, a door, and so on. The responsive decision is issued by the user through the remote APP or generated by the robot's main control module through a computed decision.
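The mapping from recognized face and posture information to an operation instruction can be sketched as a small dispatch table. The table below is an illustrative assumption; the patent does not specify these particular pairings beyond listing the available instructions.

```python
# Hypothetical posture-to-instruction table for the main control module.
POSTURE_ACTIONS = {
    "falling": "trigger_alarm",   # a detected fall warrants an immediate alarm
    "walking": "follow",
    "sitting": "accompany",
    "lying":   "accompany",
}

def decide_action(posture, is_known_face):
    """An unrecognized face overrides posture handling with an anti-theft alarm."""
    if not is_known_face:
        return "trigger_alarm"
    return POSTURE_ACTIONS.get(posture, "accompany")   # default: accompany

print(decide_action("falling", is_known_face=True))
```

A real system would also weigh confidence scores from the recognition modules before committing to an action.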
As an alternative, during following and accompanying the user can issue interactive instructions on the APP side to control the robot's advance and retreat, and the movement of the mechanical arm can provide support and assistance, helping to prevent accidental injury to the user.
Optionally, the system also comprises a second sensor module, a remote control module, a biological characteristic sensor, a built-in database, a display module, a loudspeaker and a positioning module which are connected with the main control module,
the second sensor module is used for detecting environmental temperature and humidity information in the house according to user setting or remote instructions;
the remote control module is used for sending a remote control command to the intelligent household appliance according to a control command of the mobile terminal APP or a decision result of the main control module;
the built-in database is used for storing user-set parameters, index data, state data, and other data required for system operation;
the display module is used for completing the display and output of related information under the control of the main control module;
the loudspeaker is used for outputting voice information under the control of the main control module;
and the positioning module is used for acquiring the position information of the robot under the control of the main control module.
In the optional implementation mode, the biological characteristic sensor is connected with the main control module, and under the control of the main control module, various body posture image feature sets of the family members, various body detection indexes of the family members and personal data of the family members are collected; the built-in database is connected with the main control module, and stores various body health state data of family members, user remote instructions, facial feature data of family members and known visitors, family map data, operation logs and other parameters and data required by system operation under the control of the main control module; the display module is connected with the main control module and is used for displaying the state information of the equipment, the output information of the robot and the output information of the third party application under the control of the main control module.
In one example, the built-in database of the present invention is a collection of multiple data tables, and may be further classified into a system parameter database, a user health database, a recipe database, etc. according to the relationships or functions between the data tables. Where "database" refers to a collection of one or more interrelated data tables, the relationship between such data tables may be direct or indirect. As a preferred option, part of the frequently accessed small-scale data may be stored in the robot, referred to as built-in data (library) or local data (library), to distinguish between large-scale, less frequently accessed data (library) stored on the server. It is also possible, as an alternative or in addition, to combine, split, unify or store these data (libraries) in some way on a robot local memory or server. Where it is not necessary to strictly distinguish between local and server storage, it may be collectively referred to as a database.
The database may further comprise at least a common-disease physical state index database, a user health record database, a recipe database, a diet preference database, and a configuration database, wherein:
the common-disease physical state index database stores the possible variation ranges of various physical state indexes (such as heartbeat, electrocardiogram, blood oxygen, skin conductance, and blood sugar) for common diseases, the corresponding recipe recommendations (recommending specific recipes) and recommended diet labels such as low sugar, low salt, low sodium, low fat, high protein, low protein, multivitamin, and high fiber, as well as the contraindication requirements corresponding to common diseases, such as avoiding oily food, avoiding salty food, avoiding seafood, or avoiding certain diet combinations;
the user health record database records the user's health records, the various physical state indexes and the logs in which they were obtained, the recipe logs selected by the user, and the like;
the recipe database records the ingredients of each recipe and food attribute labels such as high sugar, high oil, low sugar, low sodium, and low fat;
the diet preference database records labels such as the user's usual diet style preferences; when the user selects a menu, these labels are automatically matched against the cooking style preferences and displayed when recipes are recommended;
the configuration database records the various parameters configured by the user, the system working parameters, and the like.
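One way to realize this set of related tables is with an embedded relational store such as SQLite, sketched below with Python's bundled sqlite3 module. The table and column names are assumptions made for illustration; the patent only names the databases, not their schemas.

```python
# Hypothetical built-in database schema as a set of related SQLite tables.
import sqlite3

conn = sqlite3.connect(":memory:")   # local storage on the robot, per the text
conn.executescript("""
CREATE TABLE disease_index (disease TEXT, index_name TEXT,
                            low REAL, high REAL, diet_label TEXT);
CREATE TABLE user_health   (user TEXT, index_name TEXT,
                            value REAL, logged_at TEXT);
CREATE TABLE recipe        (name TEXT, labels TEXT);
CREATE TABLE diet_pref     (user TEXT, style TEXT);
CREATE TABLE config        (key TEXT PRIMARY KEY, value TEXT);
""")

# Example row: a recipe with its food attribute labels.
conn.execute("INSERT INTO recipe VALUES (?, ?)",
             ("vegetable congee", "low_sugar,low_sodium"))
row = conn.execute("SELECT labels FROM recipe WHERE name=?",
                   ("vegetable congee",)).fetchone()
print(row[0])
```

As the text notes, frequently accessed small-scale tables could live locally like this, while large or rarely accessed tables would sit on a server.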
Optionally, the system also comprises a photosensitive sensor and a distance sensor which are connected with the main control module;
a photosensor for detecting the brightness in the room;
and the distance sensor is used for measuring the distance between the robot's mechanical arm and the object the arm operates on, and for measuring the distance between the robot and the accompanied user when the robot is following and accompanying.
In this alternative embodiment, the distance sensor measures the distance between the mechanical arm and its operation object (e.g., a switch), and measures the distance between the robot and the accompanied user during following and accompanying.
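A common way to use the follow-distance reading is a proportional controller on forward speed, sketched below. The setpoint, gain, and speed cap are illustrative values; the patent does not specify the control law.

```python
# Hypothetical follow-distance controller: hold a set gap to the user.
SET_DISTANCE_M = 1.0   # desired gap to the accompanied user (assumed)
KP = 0.8               # proportional gain (assumed)
MAX_SPEED = 0.5        # m/s safety cap on commanded speed (assumed)

def follow_speed(measured_m):
    """Positive -> advance toward the user, negative -> back away."""
    error = measured_m - SET_DISTANCE_M
    speed = KP * error
    return max(-MAX_SPEED, min(MAX_SPEED, speed))   # clamp for safety

print(follow_speed(1.5))   # user pulled ahead: advance
print(follow_speed(1.0))   # at the setpoint: stop
```

A deployed controller would add a deadband and integrate the gyroscope/accelerometer data for smooth motion, but the proportional core is the same.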
Optionally, the system also comprises a laser radar, a gyroscope and an accelerometer which are connected with the main control module,
the laser radar is used for acquiring environment information of the robot, wherein the environment information comprises wall information, door and window information and barrier information in the advancing process;
the gyroscope is used for measuring the direction in the running process of the robot and sensing in the horizontal and vertical directions;
and the accelerometer is used for measuring the inclination angle of the equipment relative to the horizontal plane and acquiring the acceleration values of the robot in the directions of x, y and z respectively.
In this alternative embodiment, the lidar senses the robot's surroundings, such as the shape, distance, height, and speed of nearby objects; the gyroscope measures direction, calibrates the robot in the horizontal, vertical, and other directions, and provides the basis for keeping the robot balanced.
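The inclination angle relative to the horizontal plane mentioned above can be recovered from the static accelerometer readings along x, y, and z, since at rest they measure the gravity vector. A standard formula, shown here as a sketch:

```python
# Tilt of the device's z-axis away from vertical, from static accelerometer
# readings (ax, ay, az) in any consistent unit (e.g. m/s^2).
import math

def tilt_deg(ax, ay, az):
    """Angle between the device's z-axis and vertical, in degrees."""
    return math.degrees(math.atan2(math.hypot(ax, ay), az))

print(round(tilt_deg(0.0, 0.0, 9.81), 1))   # flat on the floor
print(round(tilt_deg(9.81, 0.0, 0.0), 1))   # tipped fully onto its side
```

This static estimate only holds when the robot is not accelerating; while moving, it would be fused with the gyroscope data.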
Optionally, the remote control module comprises a bluetooth module, a radio frequency module and an infrared emission receiving module;
the Bluetooth module is used for realizing wireless Bluetooth communication, positioning and control under the control of the main control module;
the radio frequency module is used for carrying out radio frequency communication with part of devices in the environment;
and the infrared transmitting and receiving module is used for communicating with the intelligent household electrical appliance under the control of the main control module.
In the optional implementation mode, the Bluetooth module realizes short-distance wireless Bluetooth communication, positioning and control under the control of the main control module; the radio frequency module is used as an optional configuration for communication, ranging and the like of certain devices; and the infrared transmitting and receiving module is used for communicating with the intelligent household electrical appliance under the control of the main control module, such as sending a remote control command and receiving infrared data.
The embodiment of the application also provides a control method of the intelligent household robot, which is applied to the intelligent household robot and comprises the following steps:
the first sensor module detects the composition and content of the gas components in the environment and sends the composition and content to the main control module;
the main control module analyzes the composition and content of the gas components and, when the content of a specific gas component exceeds a set threshold, sends alarm information over the Internet to the APP on the mobile terminal; the user can then operate the mobile APP, which sends instructions to the robot over the Internet to control the executing mechanism to perform specific actions such as moving the robot forward, backward or steering, and moving or grasping with the mechanical arm;
the executing mechanism runs to a device generating gas components in the environment under the control of APP of the mobile terminal or the decision control of the main control module, and executes closing operation to control the device to stop generating the gas components.
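The detect-analyze-act loop in the three steps above can be sketched as follows; the gas names, threshold values (in ppm) and action labels are illustrative assumptions rather than values fixed by the patent:

```python
# Illustrative thresholds (ppm); the patent leaves concrete values open.
GAS_THRESHOLDS_PPM = {"CH4": 5000, "CO": 50}

def check_gas_reading(component: str, ppm: float) -> str:
    """Map one sensor reading to the main control module's next action."""
    limit = GAS_THRESHOLDS_PPM.get(component)
    if limit is None:
        return "ignore"              # component is not monitored
    if ppm > limit:
        # alarm the mobile APP, then drive to the source and close it
        return "alarm_and_dispatch"
    return "ok"
```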
Optionally, the intelligent home robot further comprises a built-in database, and the method further comprises:
acquiring a user physical state index through a biosensor, or reading a user physical state index input by the user, and storing the index in the built-in database;
the system takes the physical state index as the input of the algorithm, generates personalized recipes through the calculation of the algorithm, and pushes the personalized recipes to the user.
In this embodiment, the recipe recommendation function may specifically include the following steps:
step one: the biosensor obtains the user's physical state indexes, such as heartbeat, electrocardiogram, skin conductance, blood sugar, blood oxygen and other health indexes;
step two: the system matches the obtained user health indexes against the common-disease physical state index database, obtains the N recipes recommended by the system according to the matching result (N is a natural number stored in the database as a configurable parameter), and records the recommended diet indexes corresponding to the user, the recipe records selected by the user, and the user health file;
step three: the physical state index data of the user is recorded as follows:
A = {a_1, a_2, …, a_n};
the user geographical location is recorded as follows:
L = {administrative division l_1, average housing price of the district l_2, regional taste preference l_3, …};
the user-input diet preference F and the like construct the user initial portrait data as follows:
D = {d_1, d_2, …, d_m};
wherein:
D = A ∪ L ∪ F;
step four: calculating the similarity of the initial image data between users to obtain a similarity matrix between the users;
S = (s_ij)_{n×n}
wherein: s is S ij Representing the similarity of the initial image data of user i and user j. Calculating k similar user sets for user i such as:
U(i)={u r (i)},r=1,2,...k,k<n,u r (i)∈{s it t=1.2, … n, and satisfies s i1 ≥s i2 ≥s i3 …≥s in
Step five: analyzing the recipe record selected by the user i, putting an abnormal food list with very low user selection frequency (lower than a threshold V) into a set 0 (i), and putting a common menu with high user selection frequency (higher than the threshold V) into a set C (i); the union of all user j selection frequency high employing menu set C (j) similar to user i is as follows:
CR(i)=C(1)∪C(2)…∪C(k);
wherein j ε U (i);
and ordering from high to low according to the frequency; taking:
OR(i)=O(1)∪O(2)…∪O(k):
step six: the system randomly draws a recommended recipe from CR(i) with probability 95% and from OR(i) with probability 5%, and recommends it to user i; user i selects a suitable recipe from the recommendations, the system records the user's selection history, and prompt information such as food-processing and preparation notes is added according to the diet style preferences recorded for the user;
step seven: the system mines trends in the user's selections from the recommended recipes, such as consumption level and dietary preferences, and adjusts subsequent recommendations accordingly.
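Steps four to six can be sketched as follows. The patent does not fix the similarity measure or the data layout, so this minimal Python illustration assumes cosine similarity for s_ij, numeric portrait vectors, and per-user common/rare recipe sets:

```python
import math
import random

def cosine_sim(a, b):
    """Assumed similarity s_ij between two users' portrait vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb) if na and nb else 0.0

def top_k_similar(i, portraits, k):
    """Step four: U(i), the k users most similar to user i."""
    scores = [(cosine_sim(portraits[i], p), j)
              for j, p in enumerate(portraits) if j != i]
    scores.sort(reverse=True)        # s_i1 >= s_i2 >= ... >= s_in
    return [j for _, j in scores[:k]]

def recommend(i, portraits, common, rare, k=3, p_common=0.95, rng=None):
    """Steps five and six: form CR(i) and OR(i) as the unions of the
    neighbours' common sets C(j) and rare sets O(j), then sample
    with 95% / 5% probability."""
    rng = rng or random.Random()
    neighbours = top_k_similar(i, portraits, k)
    cr = set().union(*(common[j] for j in neighbours))   # CR(i)
    orr = set().union(*(rare[j] for j in neighbours))    # OR(i)
    pool = cr if cr and (not orr or rng.random() < p_common) else orr
    return rng.choice(sorted(pool)) if pool else None
```

A real deployment would replace the hand-made vectors with the portrait data D = A ∪ L ∪ F and weight the frequency ordering, but the control flow is the same.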
When the patent is implemented, the recommendation algorithm may also be varied and adjusted, including but not limited to: recommending other recipes based on the similarity of food description tags; recommending similar recipes according to recipes selected by users in the same period and season in the past; and attaching, to recipes confirmed or selected by the user, videos of their preparation, links for purchasing the ingredients, and the like.
The following illustrates the implementation process of each module in different application scenarios:
1. intelligent anti-theft device
As shown in fig. 4, fig. 4 is a flowchart illustrating an embodiment of an intelligent anti-theft control method applied to the intelligent home robot shown in fig. 1, the intelligent anti-theft control method comprising the steps of:
And 110, when the user goes out, the robot receives from the video acquisition module each frame of the infrared camera recording, acquired in real time, that contains a human figure, performs image segmentation, and then recognizes the segmented image and compares it with the images already in the database.
And 120, each acquired frame is taken as input to an image recognition algorithm, and the computation runs on the main control module.
And 130, if the computation finds that the facial features or posture features of the figure are inconsistent with all records stored in the biometric database and the homeowner is not at home, the person is preliminarily identified as an intruder; the robot is controlled to broadcast the voice "Please stop immediately, or an alarm will be raised at once", and at the same time the robot dials a video call to the mobile terminal so that the homeowner can hold a video conversation with the intruder.
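The step-130 decision can be illustrated as follows. The patent does not specify the recognition model, so this sketch assumes each frame has already been reduced to a numeric feature vector and uses a simple Euclidean-distance threshold as a stand-in for the matching algorithm:

```python
def euclidean(a, b):
    """Distance between two feature vectors."""
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

def classify_person(frame_features, biometric_db, owner_home, tol=0.5):
    """Flag an intruder only when no database record matches AND the
    homeowner is away, mirroring the two conditions in step 130."""
    matched = any(euclidean(frame_features, rec) <= tol
                  for rec in biometric_db)
    if not matched and not owner_home:
        return "intruder"   # broadcast the warning, dial the video call
    return "known"
```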
2. Recipe recommendation
As shown in fig. 5, fig. 5 is a flowchart illustrating an embodiment of a recipe recommendation control method applied to the intelligent home robot shown in fig. 1, the recipe recommendation control method comprising the steps of:
step 210: the biosensor obtains the user's physical state indexes, such as heartbeat, electrocardiogram, skin conductance, blood sugar, blood oxygen and other health indexes;
step 220: the system matches the obtained user health indexes against the common-disease physical state index database, obtains the N recipes recommended by the system according to the matching result (N is a natural number stored in the database as a configurable parameter), and records the recommended diet indexes corresponding to the user, the recipe records selected by the user, and the user health file;
step 230: the physical state index data of the user is recorded as follows:
A = {a_1, a_2, …, a_n};
the user geographical location is recorded as follows:
L = {administrative division l_1, average housing price of the district l_2, regional taste preference l_3, …};
the user-input diet preference F and the like construct the user initial portrait data as follows:
D = {d_1, d_2, …, d_m};
wherein:
D = A ∪ L ∪ F;
step 240: calculating the similarity of initial image data between users to obtain a similarity matrix S between the users;
S = (s_ij)_{n×n}
wherein: s_ij represents the similarity of the initial portrait data of user i and user j. The set of the k users most similar to user i is calculated as:
U(i) = {u_r(i)}, r = 1, 2, …, k, k < n, u_r(i) ∈ {s_it}, t = 1, 2, …, n, and satisfies s_i1 ≥ s_i2 ≥ s_i3 ≥ … ≥ s_in;
step 250: analyze the recipe records selected by user i, put the unusual foods whose selection frequency is very low (below a threshold V) into a set O(i), and put the common dishes whose selection frequency is high (above the threshold V) into a set C(i); the union of the high-frequency common-dish sets C(j) of all users j similar to user i is as follows:
CR(i) = C(1) ∪ C(2) ∪ … ∪ C(k);
wherein j ∈ U(i);
order them from high to low by frequency; likewise take:
OR(i) = O(1) ∪ O(2) ∪ … ∪ O(k);
step 260: the system randomly draws a recommended recipe from CR(i) with probability 95% and from OR(i) with probability 5%, and recommends it to user i; user i selects a suitable recipe from the recommendations, the system records the user's selection history, and prompt information such as food-processing and preparation notes is added according to the diet style preferences recorded for the user;
step 270: the system mines trends in the user's selections from the recommended recipes, such as consumption level and dietary preferences, and adjusts subsequent recommendations accordingly.
When the patent is implemented, the recommendation algorithm may also be varied and adjusted, including but not limited to: recommending other recipes based on the similarity of food description tags; recommending similar recipes according to recipes selected by users in the same period and season in the past; and attaching, to recipes confirmed or selected by the user, videos of their preparation, links for purchasing the ingredients, and the like.
3. Travel mode
Referring to fig. 6, fig. 6 is a flowchart illustrating an embodiment of a travel mode control method applied to the intelligent home robot shown in fig. 1, the travel mode control method includes the following steps:
step S310, if the robot is recognized to go out of the home, the robot enters a travel mode. In the travel mode, the robot stands by in a standby state, if a control instruction for inspecting the home environment or opening/closing various devices sent by the mobile terminal is received, the current state of the environment or the devices is inspected by controlling the driving wheel to the corresponding position, and if the robot needs to be closed by hand, the mobile terminal selects the control button of the corresponding intelligent home appliance through controlling the mechanical arm and the driving wheel on the wheel disc interface or the device monitoring interface and performs remote real control by combining an infrared camera of the robot; after the user leaves home, the user can remotely check the conditions of the appointed area by means of a plurality of groups of sensor modules, and the intelligent household appliances in the remote control on/off house are realized by connecting the main control module with the execution module. For equipment or a switch which does not support the internet of things protocol, the operation can be completed by means of a mechanical arm; aiming at unknown scenes or too complex scenes, a user can remotely check videos through the APP and control the motor to move forwards and backwards and turn and the mechanical arm to finish remote control through the main control module.
Step S320, when the whole family goes out, the robot enters travel mode. In this mode, family members who are out may worry about whether the main electrical switch, the air conditioner, the windows and so on are closed, whether the tap is turned off, whether a delivery has arrived, and the like. The robot waits in a standby state; the owner uses the mobile terminal to control the direction on the wheel-disc interface and the mechanical-arm buttons, or selects the control button of the corresponding smart appliance on the device-monitoring interface, to drive the wheels to the corresponding position and take pictures. If something must be closed by hand, the owner controls the mechanical arm from the mobile terminal and performs real remote control through the robot's infrared camera. For a delivery, the owner first sends the courier's credential to the robot; the robot then takes the credential presented by the courier as input to the image recognition algorithm of the main control module to decide whether to open the door: if the images match, the door is opened for the courier, and if they do not match, it stays closed.
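The courier-verification decision at the end of step S320 reduces to a match-then-act rule; in this sketch a byte-for-byte comparison stands in for the image-matching algorithm of the main control module, which the patent does not detail:

```python
def credentials_match(uploaded: bytes, captured: bytes) -> bool:
    """Stand-in for the image-matching step: exact byte equality."""
    return uploaded == captured

def door_action(uploaded: bytes, captured: bytes) -> str:
    """Open the door only when the owner's uploaded credential matches
    the one the camera captured from the courier."""
    return "open_door" if credentials_match(uploaded, captured) else "keep_closed"
```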
Scene four: remote monitoring
a. When the user gets up, the corresponding household appliances are remotely controlled through the mobile terminal, or on a timed schedule the curtains are opened, the air conditioner is turned off, and soft music is played.
b. When the owner leaves home, the owner can monitor the home at any time through the mobile terminal, while the robot monitors home security, adjusts the humidity and sunlight in the house, receives deliveries, and so on.
c. Before the user returns home, the hot-water system and the lighting system are started in advance from the mobile terminal; once the fingerprint lock is opened, the mopping-and-sweeping robot, the dishwasher and the like start to work.
d. When the user goes to sleep, all the lamps are turned off, the curtains are drawn, and the air conditioner is turned on.
Scene five: intelligent fireproof
When the gas sensor and the temperature sensor detect that the combustible-gas or carbon-monoxide value or the temperature value exceeds the early-warning threshold, the robot sends an early warning to the owner and plays the prerecorded voice "There may be a fire, please check immediately"; if a detected value exceeds the alarm threshold, the robot sounds the alarm, dials a video call to the mobile terminal, and broadcasts "There is a fire, please handle it promptly".
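The two-tier logic above (early warning below one threshold, full alarm above the other) can be sketched as follows; the temperature values are illustrative assumptions, not figures from the patent:

```python
# Assumed temperature thresholds in degrees Celsius.
WARN_THRESHOLD, ALARM_THRESHOLD = 60.0, 80.0

def fire_response(temperature_c: float) -> str:
    """Two-tier response: early warning first, full alarm above it."""
    if temperature_c >= ALARM_THRESHOLD:
        # sound the alarm and dial the mobile terminal's video phone
        return "alarm"
    if temperature_c >= WARN_THRESHOLD:
        # notify the owner and play the prerecorded warning voice
        return "warn"
    return "normal"
```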
Compared with the prior art, the intelligent household robot and control method of this embodiment employ a mobile terminal, a cloud server, a robot, a control interface, a database and smart home appliances. The mobile terminal reads, from the service side of the cloud server, the set of electrical-equipment states uploaded by the robot in real time, issues instructions to the robot through the cloud server, realizes remote monitoring and control of the house and the smart appliances, and displays the corresponding positions and real-time states of the appliances on a house plan; the robot calls the house and smart-appliance information stored in the database and remotely controls the smart appliances. The intelligent household robot and control method provided by this embodiment offer rich functions and convenient remote control: they integrate automatic and remote operation of both smart and traditional non-smart appliances, automatic fire and anti-theft early warning and alarming, remote home inspection, and accompanying and assistance functions, and they introduce multiple wireless control modes, multiple data feedback loops, and image recognition and recipe pushing.
The foregoing describes preferred embodiments of the present invention in detail. It should be understood that a person of ordinary skill in the art can make numerous modifications and variations according to the concept of the invention without creative effort. Therefore, all technical solutions that a person skilled in the art can obtain, on the basis of the prior art, through logical analysis, reasoning or limited experiments according to the inventive concept shall fall within the scope of protection defined by the claims.

Claims (8)

1. The intelligent household robot is characterized by comprising a first sensor module, a main control module and an executing mechanism, wherein the main control module is respectively connected with the first sensor module and the executing mechanism;
the first sensor module is used for detecting the content of gas components in the environment and sending the content of the components to the main control module;
the main control module is used for analyzing the content of the gas component, and when the content of the gas component exceeds a set threshold value, the main control module is connected with the Internet through the communication module to send alarm information to the APP of the mobile terminal;
the executing mechanism is used for running to a device generating the gas component in the environment under the control of the APP of the mobile terminal or the control of the main control module, and executing closing operation to control the device to stop generating the gas component;
the system also comprises a visible light and infrared video acquisition module, a face recognition module, a gesture recognition module and a communication module, wherein the visible light and infrared video acquisition module is respectively connected with the face recognition module and the gesture recognition module;
the visible light and infrared video acquisition module is used for acquiring video information of the environment where the robot is located;
the face recognition module is used for analyzing the collected video information and recognizing the face information in the video information;
the gesture recognition module is used for analyzing the collected video information and recognizing gesture information of the person in the video information;
the main control module is also used for outputting corresponding operation instructions according to the face information and the gesture information; the corresponding operation instructions comprise following, accompanying, assisting, closing and opening a mechanical part or switch, and controlling the executing mechanism to make a call and trigger an alarm;
the system also comprises a second sensor module, a remote control module, a biological characteristic sensor, a built-in database, a display module, a loudspeaker and a positioning module which are connected with the main control module,
the second sensor module is used for detecting environmental temperature and humidity information in the house according to user setting or remote instructions;
the remote control module is used for sending a remote control command to the intelligent household appliance according to a control instruction of the mobile terminal APP or a decision result of the main control module;
the built-in database is used for storing various parameters, index data, state data and other data required by system operation set by a user;
the display module is used for completing the display and output of related information under the control of the main control module;
the loudspeaker is used for outputting voice information under the control of the main control module;
the positioning module is used for acquiring the position information of the robot under the control of the main control module;
the physical state index of the user is obtained through the biosensor, or the physical state index input by the user is read, and it is stored in the built-in database; the system takes the physical state index as the input of the algorithm, generates a personalized recipe through calculation of the algorithm, and pushes it to the user, specifically:
step one: the biological sensor acquires the physical state index of the user;
step two: the system matches a common disease physical state index database in the database according to the acquired user health index, acquires N recipes recommended by the system according to the matching result, and records recommended diet indexes corresponding to the user, recipe records selected by the user and user health files;
step three: the physical state index data of the user is recorded as A;
the user geographical location is recorded as L = {administrative division l_1, average housing price of the district l_2, regional taste preference l_3};
the user-input diet preference F constructs the user initial portrait data, recorded as D;
wherein: D = A ∪ L ∪ F;
step four: calculating the similarity of the initial image data between users to obtain a similarity matrix between the users;
S = (s_ij)_{n×n}
wherein: s is S ij Representing the similarity of the initial image data of the user i and the user j, and calculating k similar user sets of the user i as follows:
U(i)={u r (i)},r=1,2,…k,k<n,u r (i)∈{s it },t=1,2,…n,
and satisfy s i1 ≥s i2 ≥s i3 …≥s in
step five: analyze the recipe records selected by user i, put the unusual foods whose selection frequency is below the threshold V into a set O(i), and put the common dishes whose selection frequency is above the threshold V into a set C(i); the union of the high-frequency common-dish sets C(j) of all users j similar to user i is as follows:
CR(i) = C(1) ∪ C(2) ∪ … ∪ C(k), wherein j ∈ U(i),
and order them from high to low by frequency; take OR(i) = O(1) ∪ O(2) ∪ … ∪ O(k);
Step six: the system randomly extracts a recommended recipe from CR (i) according to the probability of 95%, and extracts the recommended recipe from OR (i) according to the probability of 5%, and recommends the recommended recipe to a user i; the user i selects a proper recipe from the recommended recipes, the system records the selection history of the user, and prompt information such as food processing and making notes is added according to the diet style preference recorded by the user;
step seven: the system mines the tendency of the user to select a recipe from the recommended recipes and adds adjustments to the subsequent recommendations.
2. The intelligent home robot of claim 1, wherein the actuator comprises a drive wheel and a robotic arm;
the driving wheel is used for driving the robot to travel to a device generating the gas component in the environment;
and the mechanical arm is used for executing closing operation under the control of the APP of the mobile terminal or the control of the main control module so as to control the device to stop generating the gas component.
3. The intelligent home robot of claim 1, further comprising an audio acquisition module and a voice recognition module coupled to the audio acquisition module,
the audio acquisition module is used for acquiring audio data of the environment where the robot is located;
the voice recognition module is used for analyzing and recognizing the audio data.
4. The intelligent home robot of claim 1, further comprising a photosensitive sensor and a distance sensor connected to the master control module;
the photosensitive sensor is used for detecting indoor brightness;
the distance sensor is used for measuring the distance between the mechanical arm of the robot and the mechanical arm operation object and measuring the distance between the robot and a user following the accompanying person when the robot follows the accompanying person.
5. The intelligent home robot of claim 1, further comprising a lidar, a gyroscope and an accelerometer coupled to the master control module,
the laser radar is used for acquiring environment information of the robot, wherein the environment information comprises wall information, door and window information and barrier information in the advancing process;
the gyroscope is used for measuring the direction of the robot in the running process and sensing in the horizontal and vertical directions;
the accelerometer is used for measuring the inclination angle of the equipment relative to the horizontal plane and acquiring the acceleration values of the robot in the directions of x, y and z respectively.
6. The intelligent home robot of claim 1, wherein the remote control module comprises a bluetooth module, a radio frequency module, and an infrared emitting and receiving module;
the Bluetooth module is used for realizing wireless Bluetooth communication, positioning and control under the control of the main control module;
the radio frequency module is used for carrying out radio frequency communication with part of devices in the environment;
the infrared transmitting and receiving module is used for communicating with the intelligent household appliance under the control of the main control module.
7. A control method of an intelligent home robot applied to the intelligent home robot according to any one of claims 1 to 6, comprising the steps of:
the first sensor module detects the composition and content of the gas components in the environment and sends the composition and content to the main control module;
the main control module analyzes the composition and content of the gas components and, when the content of a specific gas component exceeds a set threshold, sends alarm information over the Internet to the APP on the mobile terminal; the user can then operate the mobile APP, which sends instructions to the robot over the Internet to control the executing mechanism to perform specific actions such as moving the robot forward, backward or steering, and moving or grasping with the mechanical arm;
and the executing mechanism runs to a device generating the gas component in the environment under the control of the APP of the mobile terminal or the decision control of the main control module, and executes closing operation to control the device to stop generating the gas component.
8. The method of claim 7, wherein the intelligent home robot further comprises a built-in database, the method further comprising:
acquiring a user physical state index through a biosensor, or reading a user physical state index input by the user, and storing the index in the built-in database;
the system takes the physical state index as the input of an algorithm, generates a personalized recipe through the calculation of the algorithm, and pushes the personalized recipe to a user.
CN202111231155.2A 2021-10-21 2021-10-21 Intelligent household robot and control method Active CN114190823B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111231155.2A CN114190823B (en) 2021-10-21 2021-10-21 Intelligent household robot and control method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111231155.2A CN114190823B (en) 2021-10-21 2021-10-21 Intelligent household robot and control method

Publications (2)

Publication Number Publication Date
CN114190823A CN114190823A (en) 2022-03-18
CN114190823B true CN114190823B (en) 2023-05-09

Family

ID=80646255

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111231155.2A Active CN114190823B (en) 2021-10-21 2021-10-21 Intelligent household robot and control method

Country Status (1)

Country Link
CN (1) CN114190823B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115086645B (en) * 2022-06-10 2024-06-07 湖南师范大学 Panoramic video-oriented viewpoint prediction method, device and medium
CN115453900A (en) * 2022-08-24 2022-12-09 青岛海尔科技有限公司 Device control method, device, storage medium, and electronic apparatus

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101927492B (en) * 2010-06-23 2012-01-04 焦利民 Household intelligent robot system
KR102188090B1 (en) * 2013-12-11 2020-12-04 엘지전자 주식회사 A smart home appliance, a method for operating the same and a system for voice recognition using the same
CN106154982A (en) * 2014-12-12 2016-11-23 江苏美的清洁电器股份有限公司 Family's auxiliary robot and control method thereof and family's robotic system
CN204790566U (en) * 2015-07-16 2015-11-18 高世恒 Multi -functional intelligent house robot
WO2018018403A1 (en) * 2016-07-26 2018-02-01 深圳市赛亿科技开发有限公司 Housekeeping robot and control method
CN108839036A (en) * 2018-07-05 2018-11-20 四川长虹电器股份有限公司 Home intelligent health supervision robot
CN109036565A (en) * 2018-08-29 2018-12-18 上海常仁信息科技有限公司 A kind of wisdom family life management system based on robot

Also Published As

Publication number Publication date
CN114190823A (en) 2022-03-18

Similar Documents

Publication Publication Date Title
US20220247978A1 (en) Systems and Methods of Detecting and Responding to a Visitor to a Smart Home Environment
US11256908B2 (en) Systems and methods of detecting and responding to a visitor to a smart home environment
CN105446162B (en) A kind of intelligent home furnishing control method of smart home system and robot
CN114190823B (en) Intelligent household robot and control method
CN205334101U (en) Smart home system
US9614690B2 (en) Smart home automation systems and methods
US10986789B1 (en) System and method for sensor-assisted indoor gardening
WO2020253162A1 (en) Robot and control method therefor, and intelligent home control system
US10769909B1 (en) Using sensor data to detect events
US11972352B2 (en) Motion-based human video detection
US11676360B2 (en) Assisted creation of video rules via scene analysis
US10791607B1 (en) Configuring and controlling light emitters
US12014271B2 (en) Training image classifiers
US11734932B2 (en) State and event monitoring
US11550276B1 (en) Activity classification based on multi-sensor input
US20200224899A1 (en) Carbon monoxide purge system for a property
CA3104823C (en) Network activity validation
US20230252874A1 (en) Shadow-based fall detection
US20220222944A1 (en) Security camera drone base station detection
US20220222943A1 (en) Intelligent pausing of recording by a property monitoring system
CN114488879A (en) Robot control method and robot
US11521384B1 (en) Monitoring system integration with augmented reality devices
US20230394741A1 (en) Virtual reality assisted camera placement
US20230011337A1 (en) Progressive deep metric learning
CN115309833A (en) Display control method and device for area security situation

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant