CN115167159A - Control method of intelligent nursing equipment, electronic equipment and control system - Google Patents

Control method of intelligent nursing equipment, electronic equipment and control system

Info

Publication number
CN115167159A
CN115167159A
Authority
CN
China
Prior art keywords
user
state
clothing
image
clothes
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210706014.XA
Other languages
Chinese (zh)
Inventor
李鹏
胡蒙
张进叶
逯雁洁
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hubei Midea Washing Machine Co Ltd
Original Assignee
Hubei Midea Washing Machine Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hubei Midea Washing Machine Co Ltd
Priority to CN202210706014.XA
Publication of CN115167159A
Legal status: Pending

Classifications

    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05B CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B15/00 Systems controlled by a computer
    • G05B15/02 Systems controlled by a computer electric
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05B CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B19/00 Programme-control systems
    • G05B19/02 Programme-control systems electric
    • G05B19/418 Total factory control, i.e. centrally controlling a plurality of machines, e.g. direct or distributed numerical control [DNC], flexible manufacturing systems [FMS], integrated manufacturing systems [IMS] or computer integrated manufacturing [CIM]
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/70 Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/74 Image or video pattern matching; Proximity measures in feature spaces
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/60 Type of objects
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G PHYSICS
    • G08 SIGNALLING
    • G08B SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B21/00 Alarms responsive to a single specified undesired or abnormal condition and not otherwise provided for
    • G08B21/18 Status alarms
    • G08B21/24 Reminder alarms, e.g. anti-loss alarms
    • G PHYSICS
    • G08 SIGNALLING
    • G08B SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B25/00 Alarm systems in which the location of the alarm condition is signalled to a central station, e.g. fire or police telegraphic systems
    • G08B25/01 Alarm systems in which the location of the alarm condition is signalled to a central station, characterised by the transmission medium
    • G08B25/08 Alarm systems in which the location of the alarm condition is signalled to a central station, characterised by the transmission medium, using communication transmission lines
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05B CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00 Program-control systems
    • G05B2219/20 Pc systems
    • G05B2219/26 Pc applications
    • G05B2219/2642 Domotique, domestic, home control, automation, smart house

Landscapes

  • Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Multimedia (AREA)
  • Emergency Management (AREA)
  • Business, Economics & Management (AREA)
  • Automation & Control Theory (AREA)
  • General Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Databases & Information Systems (AREA)
  • Computing Systems (AREA)
  • Artificial Intelligence (AREA)
  • Health & Medical Sciences (AREA)
  • Evolutionary Computation (AREA)
  • Software Systems (AREA)
  • Medical Informatics (AREA)
  • General Health & Medical Sciences (AREA)
  • Human Computer Interaction (AREA)
  • Manufacturing & Machinery (AREA)
  • Quality & Reliability (AREA)
  • Medical Treatment And Welfare Office Work (AREA)

Abstract

The disclosure relates to the technical field of clothing care, and in particular to a control method for an intelligent care device, an electronic device, and a control system. The control method comprises: acquiring a user image; determining the state of the clothing worn by the user from the user image; and controlling the intelligent care device to issue a care prompt according to the state of the clothing. This scheme triggers a clothing-care reminder for the user based on the detected clothing state, effectively addressing the common problems that users forget to care for their clothing or do not know when care is needed. It improves the intelligence of the care device, improves wearing comfort, and helps extend the service life of the clothing.

Description

Control method of intelligent care device, electronic device and control system
Technical Field
The disclosure relates to the technical field of clothing care, and in particular to a control method for an intelligent care device, an electronic device, and a control system.
Background
With the rapid development of the economy and the continuous rise in living standards, the care of clothing, a necessity of daily life, has attracted more and more attention from users.
Clothing care is a common need in daily life, but many users often forget to care for their clothing, do not know when it needs care, or do not know how to care for it. This affects wearing comfort, leaves the clothing in a poor condition for the wearer, and ultimately shortens its usable life.
Disclosure of Invention
To solve this technical problem, the disclosure provides a control method for an intelligent care device, an electronic device, and a control system. A clothing-care reminder is sent to the user based on the detected state of the clothing, effectively addressing the problems that users often forget to care for clothing or do not know when care is needed. This improves the intelligence of the care device, improves wearing comfort, and helps extend the service life of the clothing.
In a first aspect, an embodiment of the present disclosure provides a control method for an intelligent care device, comprising:
acquiring a user image;
acquiring the state of the clothing worn by the user from the user image; and
controlling the intelligent care device to issue a care prompt according to the state of the clothing.
In some embodiments, acquiring the state of the clothing worn by the user from the user image comprises:
acquiring the state of the clothing from a first user image and a second user image,
where the first user image is the user image captured when the user leaves home and the second user image is the user image captured when the user returns home.
In some embodiments, acquiring the state of the clothing from the first user image and the second user image comprises at least one of the following:
comparing the clothing appearance information in the first user image with that in the second user image to obtain the damage state of the clothing;
comparing the clothing color-depth information in the first user image with that in the second user image to obtain the moisture state of the clothing;
comparing the clothing color-change information in the first user image with that in the second user image to obtain the stain state of the clothing surface; and
obtaining the wearing duration of the clothing from the capture times of the first and second user images, and obtaining the odor state of the clothing from the wearing duration.
In some embodiments, acquiring the state of the clothing worn by the user from the user image comprises:
acquiring user perspiration information from the user image; and
acquiring the odor state of the clothing according to the perspiration information.
In some embodiments, the control method of the intelligent care device further comprises:
acquiring outdoor temperature information; and
acquiring the odor state of the clothing according to the outdoor temperature information, and controlling the intelligent care device to prompt the user to treat the clothing according to the odor state.
In some embodiments, before acquiring the state of the clothing worn by the user from the user image, the method further comprises:
acquiring user identity information from the user image; and
acquiring the state of the clothing worn by the user from the user image only when the user identity information matches preset user identity information.
In a second aspect, an embodiment of the present disclosure further provides another control method for an intelligent care device, comprising:
acquiring a prompt control signal sent by an image acquisition device, where the prompt control signal is generated by the image acquisition device according to the state of the clothing worn by a user, the state of the clothing being obtained by the image acquisition device from a captured user image; and
controlling the intelligent care device to issue a care prompt according to the prompt control signal.
In some embodiments, controlling the intelligent care device to issue a care prompt according to the prompt control signal comprises:
controlling the intelligent care device to issue the care prompt to the user directly according to the prompt control signal; or controlling the intelligent care device to issue the care prompt to the user through a preset mobile terminal according to the prompt control signal.
In some embodiments, the method further comprises:
controlling the intelligent care device to power on automatically; and
controlling the intelligent care device to generate an optimal care program according to the prompt control signal.
In a third aspect, an embodiment of the present disclosure further provides an electronic device, comprising:
a processor and a memory, wherein the processor, by calling a program or instructions stored in the memory, executes the steps of any control method of an intelligent care device provided by the first aspect, or the steps of any control method of an intelligent care device provided by the second aspect.
In a fourth aspect, an embodiment of the present disclosure further provides a control system, comprising:
an image acquisition device and an intelligent care device in communicative connection, wherein the image acquisition device is configured to execute the steps of any control method of an intelligent care device provided by the first aspect, and the intelligent care device is configured to execute the steps of any control method of an intelligent care device provided by the second aspect.
Compared with the prior art, the technical solution provided by the embodiments of the present disclosure has the following advantages:
In the control method of the intelligent care device, a user image is captured, the state of the clothing worn by the user is obtained from the image, and whether the clothing needs care is judged from that state; when care is needed, the intelligent care device is controlled to issue a care prompt to the user. A clothing-care reminder is thus triggered by the detected clothing state, which effectively addresses the problems that users often forget to care for clothing or do not know when care is needed, improves the intelligence of the care device, improves wearing comfort, and helps extend the service life of the clothing.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present disclosure and, together with the description, serve to explain its principles.
To describe the embodiments of the present disclosure or the technical solutions in the prior art more clearly, the drawings used in that description are briefly introduced below. It will be apparent to those skilled in the art that other drawings can be derived from these drawings without inventive effort.
Fig. 1 is a schematic flowchart of a control method of an intelligent care device according to an embodiment of the present disclosure;
fig. 2 is a schematic structural diagram of a control device of an intelligent care apparatus according to an embodiment of the present disclosure;
fig. 3 is a schematic flowchart of another control method of an intelligent care device according to an embodiment of the present disclosure;
fig. 4 is a schematic structural diagram of a control device of another intelligent care device according to an embodiment of the present disclosure;
fig. 5 is a schematic flowchart of a control method of an intelligent care device according to an embodiment of the present disclosure;
fig. 6 is a schematic structural diagram of an electronic device according to an embodiment of the present disclosure;
fig. 7 is a schematic structural diagram of a control system according to an embodiment of the present disclosure.
Detailed Description
So that the above objects, features, and advantages of the present disclosure may be more clearly understood, its aspects are further described below. It should be noted that, in the absence of conflict, the embodiments of the present disclosure and the features within the embodiments may be combined with one another.
Many specific details are set forth in the following description to facilitate a thorough understanding of the present disclosure, but the disclosure can also be practiced in ways other than those described here; clearly, the embodiments described in this specification are only some, and not all, of the embodiments of the present disclosure.
The following describes an example of a control method, an electronic device, and a control system of an intelligent care device according to an embodiment of the present disclosure with reference to the drawings.
Fig. 1 is a schematic flowchart of a control method for an intelligent care device according to an embodiment of the present disclosure. The method is applicable to scenarios in which a care device needs to be controlled. It may be executed by a control apparatus linked with the intelligent care device, for example a controller in the image acquisition device, and the control apparatus may be implemented in software and/or hardware. As shown in Fig. 1, the method comprises the following steps.
and S101, acquiring a user image.
Specifically, by setting an image capture device such as, but not limited to, a camera, a user is photographed by the camera to obtain a user image. The camera can be always in an open state, images are shot in real time, and user images are obtained according to the shot images; the camera can also be started at preset interval time to shoot images, and user images are obtained according to the shot images. Therefore, the user image can be acquired by arranging the image acquisition equipment.
In other embodiments, the user image may be obtained by a technique known to those skilled in the art, and is not limited in detail herein.
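As a minimal sketch of the interval-triggered capture mode described above: the interval length and the helper's name are illustrative assumptions, not values from the disclosure (the always-on mode corresponds to a zero interval).

```python
from datetime import datetime, timedelta
from typing import Optional

CAPTURE_INTERVAL = timedelta(minutes=5)  # assumed preset interval

def should_capture(now: datetime, last_capture: Optional[datetime]) -> bool:
    """Decide whether the camera should take a new frame.

    With no prior frame we always capture; otherwise we wait until the
    preset interval has elapsed since the last capture.
    """
    if last_capture is None:
        return True
    return now - last_capture >= CAPTURE_INTERVAL
```

A real deployment would call this from the camera's frame loop; the trigger logic itself is independent of the camera hardware.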
S102, acquiring the state of the clothing worn by the user from the user image.
Specifically, the user image captured by the camera includes an image of the clothing worn by the user, from which state information of the clothing can be obtained. The specific state of the clothing worn by the user can therefore be acquired from the user image.
In some embodiments, acquiring the state of the clothing worn by the user from the user image comprises:
acquiring the state of the clothing from a first user image and a second user image,
where the first user image is the user image captured when the user leaves home and the second user image is the user image captured when the user returns home.
Specifically, when the user leaves, a camera installed outside the door can capture a user image; this is the first user image, which records the state of the clothing at the time of departure. The specific state of the clothing when the user leaves can thus be obtained from the first user image.
When the user returns home and opens the door, the camera outside the door can again capture a user image; this is the second user image, which records the state of the clothing at the time of return. The specific state of the clothing when the user returns can thus be obtained from the second user image.
It should be noted that the clothing may be garments or shoes worn by the user. When the clothing is shoes, it is preferable to capture an image covering a large area of the shoes, for example one including the toe caps, when acquiring their state from the user image. If the camera outside the door alone cannot capture such an image when the user leaves or returns, an additional camera can be installed inside the door: the outdoor camera captures the shoes when the user returns and opens the door, while the indoor camera captures them when the user leaves. Combining the two cameras yields large-area images of the shoes, from which their state can be obtained, improving the accuracy of the shoe-state judgment.
It should also be noted that whether the user is currently leaving or returning can be determined from the captured user images. In either case the image acquisition device captures a group of continuously shot frames, and analyzing whether the frames show the user's front or back reveals the direction of movement. For example, when leaving, the user's front faces outward and the back faces indoors; when returning, the front faces indoors and the back faces outward. An acquired user image can thus be classified as a departure image or a return image. In addition, if the outdoor camera can capture a continuous sequence of the user's movement, then when the clothing is shoes, both the departure and the return images, including large-area shoe images, can be obtained from the outdoor camera alone.
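A toy sketch of the departure/return classification just described, assuming an upstream detector has already labeled each frame of the burst as showing the user's "front" or "back" to the outdoor camera; the labels and the majority-vote rule are illustrative assumptions, not part of the disclosure.

```python
from collections import Counter
from typing import List

def classify_direction(frame_labels: List[str]) -> str:
    """Classify a burst of outdoor-camera frames as 'leaving' or 'returning'.

    Assumption: when the user leaves, the outdoor camera mostly sees the
    user's front (the front faces outward); when the user returns, it
    mostly sees the user's back. Majority vote over the burst.
    """
    counts = Counter(frame_labels)
    return "leaving" if counts["front"] >= counts["back"] else "returning"
```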
In some embodiments, acquiring the state of the clothing from the first user image and the second user image comprises at least one of the following:
comparing the clothing appearance information in the first user image with that in the second user image to obtain the damage state of the clothing;
comparing the clothing color-depth information in the first user image with that in the second user image to obtain the moisture state of the clothing;
comparing the clothing color-change information in the first user image with that in the second user image to obtain the stain state of the clothing surface; and
obtaining the wearing duration of the clothing from the capture times of the first and second user images, and obtaining the odor state of the clothing from the wearing duration.
Specifically, the first user image is captured when the user leaves and yields the appearance of the clothing at that moment; the second user image is captured when the user returns home and opens the door, and yields the appearance at that moment. Comparing the two appearances reveals whether the clothing has been damaged or deformed, i.e., the damage state of the clothing is obtained. Taking shoes as an example, if comparison of the second user image with the first shows that the shape of some region of a shoe has clearly changed, such as a dent, the shoe can be judged damaged or deformed. Taking garments as an example, if comparison shows an obvious change in the appearance of some region, such as a new tear, the garment can be judged damaged.
Similarly, the color depth of the clothing at departure can be compared with that at return to judge whether the clothing is wet, i.e., the moisture state of the clothing is obtained. For example, if the user is caught in rain while out, the fabric darkens when wet; if the comparison shows that the clothing in the second user image is noticeably darker than in the first, the clothing can be judged to be in a wet state.
Likewise, the color change of the clothing between departure and return can be compared to judge whether there are stains on the surface of the worn clothing, i.e., the stain state of the clothing is obtained. For example, if the comparison shows an obvious color change on the surface of some region of the clothing, a stain on that surface can be inferred.
Finally, taking the capture time of the first user image as a first time node and the capture time of the second user image as a second time node, the length of time the clothing has been worn outside can be computed from the two nodes, and the degree of odor of the clothing judged from that duration, i.e., the odor state of the clothing is obtained: the longer the clothing is worn, the stronger its odor.
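The moisture and stain comparisons and the time-based odor estimate above can be sketched as follows. The grayscale-array representation, the thresholds, and the three-level odor scale are illustrative assumptions rather than values from the disclosure; damage detection, which would require shape matching, is omitted here.

```python
import numpy as np
from datetime import datetime

def is_wet(exit_img: np.ndarray, entry_img: np.ndarray,
           darken_thresh: float = 0.15) -> bool:
    """Moisture check: wet fabric photographs darker, so flag the clothing
    as wet when mean brightness drops by more than darken_thresh (0..1)."""
    drop = (float(exit_img.mean()) - float(entry_img.mean())) / 255.0
    return drop > darken_thresh

def has_stain(exit_img: np.ndarray, entry_img: np.ndarray,
              diff_thresh: int = 40, area_frac: float = 0.02) -> bool:
    """Stain check: a stain is a localized color change, so flag when the
    fraction of strongly changed pixels exceeds area_frac."""
    diff = np.abs(exit_img.astype(int) - entry_img.astype(int))
    return float((diff > diff_thresh).mean()) > area_frac

def odor_state(exit_time: datetime, entry_time: datetime) -> str:
    """Odor check from wearing duration: longer wear, stronger odor."""
    hours = (entry_time - exit_time).total_seconds() / 3600.0
    if hours < 4:
        return "low"
    if hours < 10:
        return "medium"
    return "high"
```

In practice the two clothing regions would first have to be segmented and aligned between the departure and return frames; these checks operate on the aligned crops.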
In some embodiments, acquiring the state of the clothing worn by the user from the user image comprises:
acquiring user perspiration information from the user image; and
acquiring the odor state of the clothing according to the perspiration information.
Specifically, if the temperature is high that day or the user exercises outdoors, the user tends to sweat, and sweat readily gives clothing an odor. On this basis, preferably when the user returns home and opens the door, whether and how heavily the user has sweated is judged from the captured user image, i.e., the user perspiration information is acquired; the degree of odor of the clothing is then judged from this information, i.e., the odor state of the clothing is obtained.
For example, if a large amount of sweat is recognized in the image captured when the user returns home, the worn clothing can be determined to have a strong odor; if little sweat is recognized, the odor of the worn clothing can be determined to be slight.
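A minimal sketch of the sweat-to-odor mapping above; the notion of a detector returning a "sweat ratio" (fraction of the visible clothing showing sweat marks) and the thresholds are assumptions introduced for illustration.

```python
def odor_from_sweat(sweat_ratio: float) -> str:
    """Map an assumed sweat-mark ratio (0..1, from an upstream image
    detector) to a coarse odor state for the worn clothing."""
    if sweat_ratio > 0.30:
        return "strong"
    if sweat_ratio > 0.10:
        return "slight"
    return "none"
```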
In some embodiments, the control method of the intelligent care device further comprises:
acquiring outdoor temperature information; and
acquiring the odor state of the clothing according to the outdoor temperature information, and controlling the intelligent care device to prompt the user to treat the clothing according to the odor state.
Specifically, the outdoor temperature can be obtained by the image acquisition device, for example through interactive communication with the user's mobile terminal, such as a mobile phone, by reading the temperature displayed in an application on the phone. The degree of odor of the worn clothing can then be judged from the outdoor temperature: a high outdoor temperature suggests a strong odor, a low outdoor temperature a slight one.
When the odor of the clothing is judged to be strong and care is needed, the image acquisition device communicates with the user's mobile terminal and reminds the user, through the terminal, to deodorize the worn clothing. On receiving the reminder, the user can choose to deodorize the clothing with the intelligent care device immediately or at a convenient time; for example, the user can set another time point at which the device issues the deodorizing reminder again. The same terminal-based reminding applies equally to stain removal, drying, and shoe-shape recovery, and is not described again here.
In the above embodiments, whether the clothing has an odor can be judged from the user's perspiration information, from the wearing duration, or from the outdoor temperature information, or from any two or all three of these signals together, which improves the accuracy of the odor judgment. It should be noted that judging the odor from the user's perspiration information is the preferred implementation of the embodiments of the present disclosure.
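The combination of the three odor signals might be fused as below; the scoring scheme, thresholds, and the double weight on the perspiration signal (stated above to be the preferred indicator) are illustrative assumptions — the disclosure only says that two or three signals may be combined.

```python
def fuse_odor(sweat_level: str, wear_hours: float, outdoor_temp_c: float) -> str:
    """Combine the three odor indicators into one state by simple scoring.

    sweat_level:    'none' | 'slight' | 'strong' (preferred signal, weight x2)
    wear_hours:     time the clothing has been worn outside
    outdoor_temp_c: outdoor temperature in Celsius
    """
    score = 2 * {"none": 0, "slight": 1, "strong": 2}[sweat_level]
    score += 0 if wear_hours < 4 else (1 if wear_hours < 10 else 2)
    score += 0 if outdoor_temp_c < 20 else (1 if outdoor_temp_c < 30 else 2)
    if score >= 4:
        return "strong"
    if score >= 2:
        return "slight"
    return "none"
```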
S103, controlling the intelligent care device to issue a care prompt according to the state of the clothing.
Specifically, whether the user's clothing needs care can be judged from the acquired state information, and if it does, the intelligent care device is controlled to remind the user.
Optionally, when the intelligent care device is controlled to issue a care prompt, the method further comprises: controlling the device to power on automatically, and controlling it to generate an optimal care program according to the prompt control signal. The user can then place the clothing into the intelligent care device as prompted; the device powers on automatically and generates an optimal care program matched to the state of the clothing, so as to care for it.
Illustratively, if the worn clothing is damaged, the intelligent care device is controlled to remind the user; after receiving the reminder, the user places the clothing into the device, which then powers on automatically and generates the corresponding optimal care program for the damage state, so that it can repair the damage and deformation of the clothing.
Illustratively, if the worn clothing is in a wet state, the intelligent care device is controlled to remind the user; after receiving the reminder, the user places the clothing into the device, which then powers on automatically and generates the corresponding optimal care program for the moisture state, so that it can dry the clothing.
Illustratively, if there are stains on the surface of the worn clothing, the intelligent care device is controlled to remind the user; after receiving the reminder, the user places the clothing into the device, which then powers on automatically and generates the corresponding optimal care program for the surface-stain state, so that it can clean the stains on the clothing.
Illustratively, if the worn clothing has an odor, the intelligent care device is controlled to remind the user; after receiving the reminder, the user places the clothing into the device, which then powers on automatically and generates the corresponding optimal care program for the odor state, so that it can deodorize the clothing.
It should be noted that, when the state information indicates that the clothing worn by the user is simultaneously damaged, wet, stained on the surface, and has a peculiar smell, the intelligent nursing device may process the clothing in order of priority: for example, drying and odor removal first, then stain cleaning, and finally repair of damage and deformation. For example, the intelligent nursing device sends a voice prompt telling the user that drying and odor removal are currently being performed; after the drying and odor removal are finished, it sends a voice prompt that stain cleaning is currently being performed; and after the stain cleaning is finished, it sends a voice prompt that repair of damage and deformation is currently being performed. It should be noted that the embodiments of the present disclosure do not limit the specific manner in which the intelligent nursing device prompts the user about clothing care.
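The priority ordering described above can be sketched as a fixed task list that the detected states are filtered against; the task names and the exact order are taken from the example in the text, but are otherwise illustrative:

```python
# Illustrative only: the priority order follows the example in the text
# (dry and deodorize first, then clean stains, then repair damage);
# the task-name strings are assumptions.
CARE_PRIORITY = ["dry", "deodorize", "clean_stains", "repair_damage"]

def plan_care(detected_states):
    """Order the needed treatments by the fixed priority above."""
    return [task for task in CARE_PRIORITY if task in detected_states]
```

A device needing all four treatments would thus announce and run them in the listed order, skipping any treatment whose state was not detected.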
In some embodiments, before acquiring the state of the clothing worn by the user according to the user image, the method further includes:
acquiring user identity information according to the user image;
and when the user identity information is preset user identity information, acquiring the state of the clothing worn by the user according to the user image.
Specifically, the identity of the user can be recognized from the user image. For example, different users have different head-portrait information, so the user's identity can be confirmed from this information, and the user can then be classified as a resident user or a visiting guest. To distinguish resident users from visitors, the resident users' information can be set as the preset user identity information; for example, the head-portrait information of resident users is stored in the camera's processor, and when the camera captures a user's head-portrait information it can judge whether the current user is a resident user. If the user is judged to be a resident user, whole-body image information including the user's clothing is acquired, and the specific state of the clothing worn by the user is then obtained from that image information, so that timely, intelligent clothing-care reminders are provided; if the user is judged not to be a resident user, for example a visiting guest, no intelligent clothing-care reminder is triggered.
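The identity gate described above can be sketched as follows; the `face_id` string and the resident registry are assumptions standing in for the head-portrait matching the text describes:

```python
# Illustrative gating logic; in the disclosure this corresponds to matching
# the captured head-portrait information against preset user identity
# information stored in the camera's processor.
RESIDENT_IDS = {"user_a", "user_b"}   # preset (resident) user identities

def should_check_clothing(face_id: str) -> bool:
    """Only resident users trigger the clothing-state analysis."""
    return face_id in RESIDENT_IDS
```

Visitors simply fall through without triggering any reminder, which matches the behavior described in the paragraph above.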
According to the control method of the intelligent nursing device provided by the embodiments of the present disclosure, a user image captured when the user goes out is acquired, and the state of the clothing worn by the user at that time is obtained from it; a user image captured when the user returns home is acquired, and the state of the clothing at that time is obtained from it. By comparing the two states, it is judged whether the clothing worn by the user needs care; if so, the intelligent nursing device is controlled to remind the user. After the user puts the clothing into the device, the device starts up automatically and generates an optimal care program according to the state of the clothing. A clothing-care reminder is thus triggered by the detected clothing state, which effectively solves the problem that users often forget to care for their clothing or do not know when to do so, improves the intelligence of the care device, improves the comfort of the clothing the user wears, and helps extend the clothing's service life.
Based on the same inventive concept, the embodiment of the disclosure also provides a control device of the intelligent nursing equipment. Fig. 2 is a schematic structural diagram of a control apparatus of an intelligent care device according to an embodiment of the present disclosure, and as shown in fig. 2, the apparatus includes:
an obtaining module 21, configured to obtain a user image; acquiring the state of the clothing worn by the user according to the user image; and the control module 22 is used for controlling the intelligent nursing equipment to send out nursing prompts according to the state of the clothes.
The control device of the intelligent nursing device provided in the foregoing embodiments can execute the control method of the intelligent nursing device provided in the foregoing embodiments, and has the same or corresponding beneficial effects, which are not described in detail herein.
The embodiments of the present disclosure also provide a storage medium, which stores a program or instructions, where the program or instructions make a computer execute the steps of any one of the methods for controlling an intelligent care device provided in the foregoing embodiments.
In some embodiments, the computer-executable instructions, when executed by a computer processor, may also be used to implement any of the above-described methods provided by the embodiments of the present disclosure, to achieve corresponding advantages.
The embodiments of the present disclosure also provide another control method of the intelligent nursing device. Fig. 3 is a schematic flowchart of this method. The method is suitable for application scenarios in which the intelligent nursing device needs to be controlled. It can be executed by the control apparatus of the intelligent nursing device provided by the embodiments of the present disclosure, for example by a controller in the intelligent nursing device, and can be implemented in software and/or hardware. As shown in fig. 3, the method comprises the following steps:
s301, acquiring a prompt control signal sent by image acquisition equipment; the prompt control signal is a signal generated by the image acquisition equipment according to the state of the clothes worn by the user, and the state of the clothes is acquired by the image acquisition equipment according to the acquired user image.
Specifically, an image acquisition device such as, but not limited to, a camera photographs the user to obtain a user image. The image acquisition device determines the state of the clothing from the image and generates a prompt control signal in a set state, for example a high-level prompt control signal, which it sends to the intelligent nursing device; the intelligent nursing device accordingly receives the corresponding prompt control signal.
And S302, controlling the intelligent nursing equipment to send out nursing prompts according to the prompt control signals.
Specifically, after the intelligent nursing device receives a prompt control signal indicating that the user's clothing needs care, the device is controlled to remind the user to care for the clothing. The user can then put the clothing into the intelligent nursing device according to the prompt; the device starts up automatically and generates an optimal care program according to the state of the user's clothing, thereby caring for it.
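A sketch of the device-side handling of steps S301-S302 is given below; the signal encoding (a "needs care" flag plus a list of detected states) and the action names are assumptions, since the disclosure does not specify a wire format:

```python
# Illustrative device-side reaction to the prompt control signal (S301-S302).
def on_prompt_signal(signal: dict) -> list:
    """React to a prompt control signal from the image acquisition device."""
    actions = []
    if signal.get("needs_care"):
        actions.append("issue_care_prompt")   # remind the user (S302)
        actions.append("auto_power_on")       # start up automatically
        # Build an optimal care program from the detected clothing states.
        actions.append("generate_program:" + ",".join(signal.get("states", [])))
    return actions
```

When the signal does not indicate a care need, the device takes no action, matching the conditional behavior described above.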
Illustratively, if the clothing worn by the user is damaged, the intelligent nursing device is controlled to remind the user; after receiving the reminder, the user puts the clothing into the intelligent nursing device, which starts up automatically and generates a corresponding optimal care program according to the damaged state of the clothing, so that the device repairs the damage and deformation of the user's clothing.
Illustratively, if the clothing worn by the user is wet, the intelligent nursing device is controlled to remind the user; after receiving the reminder, the user puts the clothing into the intelligent nursing device, which starts up automatically and generates a corresponding optimal care program according to the wet state of the clothing, so that the device dries the user's clothing.
Illustratively, if there are stains on the surface of the clothing worn by the user, the intelligent nursing device is controlled to remind the user; after receiving the reminder, the user puts the clothing into the intelligent nursing device, which starts up automatically and generates a corresponding optimal care program according to the surface-stain state of the clothing, so that the device cleans the stains on the user's clothing.
Illustratively, if the clothing worn by the user has a peculiar smell, the intelligent nursing device is controlled to remind the user; after receiving the reminder, the user puts the clothing into the intelligent nursing device, which starts up automatically and generates a corresponding optimal care program according to the peculiar-smell state of the clothing, so that the device removes the peculiar smell from the user's clothing.
In some embodiments, controlling the intelligent nursing device to issue a care prompt to the user according to the prompt control signal includes:
controlling the intelligent nursing device to issue the care prompt to the user directly according to the prompt control signal; or controlling the intelligent nursing device to issue the care prompt to the user through a preset mobile terminal according to the prompt control signal.
Specifically, after the intelligent nursing device receives a prompt control signal indicating that the user's clothing needs care, the device can directly emit voice prompt information, display prompt information, or do both at the same time. For example, it emits a voice prompt reminding the user that the clothing being worn needs care; after receiving the voice prompt, the user puts the clothing into the intelligent nursing device, which then cares for the user's clothing.
Or the intelligent nursing device is in communication connection with a mobile terminal of the user, the user is reminded of needing to nurse the clothing worn by the user through the mobile terminal such as a mobile phone, and after the user receives the reminder from the mobile terminal, the clothing is put into the intelligent nursing device, so that the intelligent nursing device nurses the clothing of the user.
It should be noted that if, after receiving the reminder, the user is busy and has no time to process the clothing, the intelligent nursing device interacts with the user's mobile terminal and, as configured in the mobile terminal's application according to the user's requirements, reminds the user again after a preset time.
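The "remind again after a preset time" behavior can be sketched with a one-shot timer; the 30-minute default and the `remind` callback hook are assumptions standing in for the app-configured preset time the text mentions:

```python
import threading

# Illustrative snooze mechanism for the re-reminder described in the text.
def schedule_re_reminder(remind, delay_s: float = 1800.0) -> threading.Timer:
    """Call the remind() hook once after the preset delay; returns the timer."""
    timer = threading.Timer(delay_s, remind)
    timer.daemon = True   # don't block process exit if never fired
    timer.start()
    return timer
```

The returned timer can be cancelled if the user handles the clothing before the delay elapses.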
According to the control method of the intelligent nursing device provided by the embodiments of the present disclosure, a user image captured when the user goes out is acquired, and the state of the clothing worn by the user at that time is obtained from it; a user image captured when the user returns home is acquired, and the state of the clothing at that time is obtained from it. By comparing the two states, a prompt control signal is generated when it is judged that the clothing worn by the user needs care, and the intelligent nursing device is controlled on the basis of this signal to remind the user. After the user puts the clothing into the device, the device starts up automatically and generates an optimal care program according to the state of the clothing. A clothing-care reminder is thus triggered by the detected clothing state, which effectively solves the problem that users often forget to care for their clothing or do not know when to do so, improves the intelligence of the care device, improves the comfort of the clothing the user wears, and helps extend the clothing's service life.
Based on the same inventive concept, the embodiment of the disclosure also provides another control device of the intelligent nursing equipment. Fig. 4 is a schematic structural diagram of another control apparatus of an intelligent care device according to an embodiment of the present disclosure, and as shown in fig. 4, the apparatus includes:
a control signal obtaining module 41, configured to obtain a prompt control signal sent by the image acquisition device; the prompting control signal is a signal generated by the image acquisition equipment according to the state of the clothes worn by the user, and the state of the clothes is acquired by the image acquisition equipment according to the acquired user image; and the first control module 42 is used for controlling the intelligent nursing device to send out nursing prompts according to the prompt control signals.
The other control device provided in the foregoing embodiment can execute the other control method provided in each of the foregoing embodiments, and has the same or corresponding beneficial effects, which are not described in detail herein.
The embodiments of the present disclosure also provide another storage medium, which stores a program or instructions that cause a computer to execute the steps of another control method provided in the above-described embodiments.
From the above description of the embodiments, it will be clear to those skilled in the art that the present disclosure can be implemented by software plus necessary general-purpose hardware, and certainly also by hardware alone, although in many cases the former is the better embodiment. Based on this understanding, the technical solutions of the present disclosure may be embodied in the form of a software product stored in a computer-readable storage medium, such as a floppy disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a flash memory (FLASH), a hard disk, or an optical disk, and including several instructions for enabling a computer device (which may be a personal computer, a server, or a network device) to execute the methods of the embodiments of the present disclosure.
As an example, taking the user's clothing as the shoes being worn, the image acquisition device as a camera outside the door, and the intelligent nursing device as an intelligent shoe box, the control method of the intelligent nursing device disclosed by the present application is as shown in fig. 5:
s501, the user leaves the door, and the camera outside the door collects the first user image.
S502, the user returns home to open the door, and the camera outside the door collects a second user image.
S503, the camera outside the door acquires first surface state information of the shoes worn by the user in the first user image and second surface state information of the shoes worn by the user in the second user image.
S504, the outdoor camera compares the first surface state information with the second surface state information to determine the damaged state, the wet state and the dirty state of the shoe.
And S505, the outdoor camera acquires at least one of outdoor temperature information, user sweating information in the second user image and wearing time of the user wearing the shoes, and determines whether the shoes generate peculiar smell.
S506, the camera outside the door sends a prompt control signal to the intelligent shoe box.
And S507, according to the prompt control signal, the intelligent shoe box starts up automatically, generates an optimal care program, and sends out a care prompt.
It should be added that, after the camera outside the door captures the first user image and the second user image, it can upload both images to a server; the server then performs the comparative analysis of the image information and feeds the result back to the camera or to the intelligent shoe box. The camera outside the door communicates with the intelligent shoe box through a wireless network connection.
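The S501-S507 flow can be sketched end to end on the camera side; the boolean per-state representation of the two images' surface-state information is an assumption standing in for the actual image comparison:

```python
# End-to-end sketch of the camera-side steps S503-S506; the state keys
# ("wet", "stained", ...) and the dict-based signal are illustrative.
def camera_pipeline(first_img_states: dict, second_img_states: dict) -> dict:
    """Compare go-out and come-home shoe states and build the prompt signal."""
    # A state "appears" if it is present on return but absent on departure.
    changed = {k for k, v in second_img_states.items()
               if v and not first_img_states.get(k, False)}
    return {"needs_care": bool(changed), "states": sorted(changed)}
```

The resulting dict is what S506 would send to the intelligent shoe box as the prompt control signal.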
On the basis of the foregoing implementations, the embodiments of the present disclosure further provide an electronic device; fig. 6 is a schematic structural diagram of an electronic device provided in an embodiment of the present disclosure. As shown in fig. 6, the electronic device includes a processor 601 and a memory 602. By calling a program or instructions stored in the memory, the processor 601 executes the steps of the method shown in fig. 1, in which case the electronic device may be built into the image acquisition device, or the steps of the method shown in fig. 3, in which case the electronic device may be built into the intelligent nursing device, thereby achieving the beneficial effects of the foregoing embodiments, which are not repeated here.
As shown in fig. 6, the electronic device may be arranged to comprise at least one processor 601, at least one memory 602, and at least one communication interface 603, coupled together by a bus system 604. The communication interface 603 is used for information transmission with external devices. The bus system 604 enables communication among the components and includes, in addition to a data bus, a power bus, a control bus, and a status-signal bus; for clarity, however, the various buses are all labeled in fig. 6 as the bus system 604.
It will be appreciated that the memory 602 in this embodiment can be volatile memory, nonvolatile memory, or both. In some embodiments, the memory 602 stores the following elements: an operating system and application programs, as an executable unit or data structure, a subset thereof, or an extended set thereof. In the embodiments of the present disclosure, the processor 601 executes the steps of the method embodiments provided by the present disclosure by calling the programs or instructions stored in the memory 602.
The method provided by the embodiments of the present disclosure may be applied to, or implemented by, the processor 601. The processor 601 may be an integrated circuit chip with signal-processing capability. In implementation, the steps of the above method may be completed by integrated logic circuits in hardware or by instructions in the form of software in the processor 601. The processor 601 may be a general-purpose processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field-Programmable Gate Array (FPGA) or another programmable logic device, a discrete gate or transistor logic device, or discrete hardware components. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor.
The steps of the method provided by the embodiments of the present disclosure may be directly implemented by a hardware decoding processor, or by a combination of hardware and software units in a decoding processor. The software units may be located in a storage medium well known in the art, such as random access memory, flash memory, read-only memory, programmable or electrically erasable programmable memory, or registers. The storage medium is located in the memory 602; the processor 601 reads the information in the memory 602 and completes the steps of the method in combination with its hardware.
The electronic device may also include one or more physical components to carry out the instructions generated by the processor 601 when performing the methods provided by the embodiments of the present disclosure. The physical components can be arranged inside the electronic device or outside it, for example on a cloud server. The physical components cooperate with the processor 601 and the memory 602 to implement the functions of the electronic device in this embodiment.
The embodiments of the present disclosure also provide a control system. Fig. 7 is a schematic structural diagram of a control system according to an embodiment of the present disclosure. As shown in fig. 7, the control system includes an image acquisition device 71 and an intelligent nursing device 72 in communication connection with each other; the image acquisition device 71 is configured to execute the steps of the method shown in fig. 1, and the intelligent nursing device 72 is configured to execute the steps of the method shown in fig. 3, thereby achieving the beneficial effects described in the above embodiments, which are not repeated here. Illustratively, the image acquisition device includes, but is not limited to, a camera, preferably installed outside the door of the household. It should be noted that the embodiments of the present disclosure do not specifically limit the communication mode between the image acquisition device and the intelligent nursing device, and the two may be sold as a set or separately.
It is noted that, in this document, relational terms such as "first" and "second," and the like, may be used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Also, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a … …" does not exclude the presence of another identical element in a process, method, article, or apparatus that comprises the element.
The foregoing are merely exemplary embodiments of the present disclosure, which enable those skilled in the art to understand or practice the present disclosure. Various modifications to these embodiments will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other embodiments without departing from the spirit or scope of the disclosure. Thus, the present disclosure is not intended to be limited to the embodiments shown herein but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.

Claims (11)

1. A control method of an intelligent nursing device is characterized by comprising the following steps:
acquiring a user image;
acquiring the state of the clothing worn by the user according to the user image;
and controlling the intelligent nursing equipment to send out nursing prompts according to the state of the clothes.
2. The control method of the intelligent nursing device according to claim 1, wherein the obtaining the state of the clothing worn by the user according to the user image comprises:
acquiring the state of the clothing according to the first user image and the second user image;
the first user image is the user image when the user goes out, and the second user image is the user image when the user enters.
3. The method for controlling a smart nursing device according to claim 2, wherein the obtaining the state of the garment according to the first user image and the second user image comprises at least one of the following processes:
comparing the clothing appearance information in the first user image with the clothing appearance information in the second user image to obtain the damaged state of the clothing;
comparing the clothes color shade information in the first user image with the clothes color shade information in the second user image to obtain the wet state of the clothes;
comparing the clothing color change information in the first user image with the clothing color change information in the second user image to obtain the stain state of the clothing surface;
and acquiring the wearing time of the clothes according to the acquisition time node of the first user image and the acquisition time node of the second user image, and acquiring the peculiar smell state of the clothes according to the wearing time of the clothes.
4. The control method of the intelligent nursing device according to claim 1, wherein the obtaining the state of the clothing worn by the user according to the user image comprises:
acquiring user sweating information in the user image;
and acquiring the peculiar smell state of the clothes according to the user sweating information.
5. The control method of an intelligent care device according to claim 1, further comprising:
acquiring outdoor temperature information;
and acquiring the peculiar smell state of the clothes according to the outdoor temperature information, and controlling the intelligent nursing equipment to send a nursing prompt according to the peculiar smell state of the clothes.
6. The control method of the intelligent nursing device according to claim 1, before the obtaining the state of the clothing worn by the user according to the user image, further comprising:
acquiring user identity information according to the user image;
and when the user identity information is preset user identity information, acquiring the state of the clothing worn by the user according to the user image.
7. A control method of an intelligent nursing device is characterized by comprising the following steps:
acquiring a prompt control signal sent by image acquisition equipment; the prompt control signal is a signal generated by the image acquisition equipment according to the state of a dress worn by a user, and the state of the dress is acquired by the image acquisition equipment according to the acquired user image;
and controlling the intelligent nursing equipment to send nursing prompts according to the prompt control signals.
8. The method for controlling the intelligent nursing device according to claim 7, wherein the controlling the intelligent nursing device to issue nursing prompts according to the prompt control signals comprises:
controlling the intelligent nursing equipment to directly send nursing prompts to users according to the prompt control signals; or controlling the intelligent nursing equipment to send nursing prompts to the user through a preset mobile terminal according to the prompt control signal.
9. The method of controlling a smart care device according to claim 7, the method further comprising:
controlling the intelligent nursing equipment to be automatically started;
and controlling the intelligent nursing equipment to generate an optimal nursing program according to the prompt control signal.
10. An electronic device, comprising:
a processor and a memory, the processor performing the steps of the method of controlling a smart care device according to any one of claims 1-6 or performing the steps of the method of controlling a smart care device according to any one of claims 7-9 by calling a program or instructions stored in the memory.
11. A control system, comprising:
an image capturing device and a smart care device, the image capturing device and the smart care device being communicatively connected, the image capturing device being configured to perform the steps of the method for controlling a smart care device according to any of claims 1-6, the smart care device being configured to perform the steps of the method for controlling a smart care device according to any of claims 7-9.
CN202210706014.XA 2022-06-21 2022-06-21 Control method of intelligent nursing equipment, electronic equipment and control system Pending CN115167159A (en)



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination