CN111817929B - Equipment interaction method and device, household equipment and storage medium - Google Patents

Equipment interaction method and device, household equipment and storage medium

Info

Publication number
CN111817929B
CN111817929B (application CN202010482975.8A)
Authority
CN
China
Prior art keywords
user
identity information
interaction
target
current user
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202010482975.8A
Other languages
Chinese (zh)
Other versions
CN111817929A (en)
Inventor
袁珊娜
赵雪
岳长琴
李彭安
隋俊华
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Qingdao Haier Smart Technology R&D Co Ltd
Haier Smart Home Co Ltd
Original Assignee
Qingdao Haier Smart Technology R&D Co Ltd
Haier Smart Home Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Qingdao Haier Smart Technology R&D Co Ltd and Haier Smart Home Co Ltd
Priority to CN202010482975.8A
Publication of CN111817929A
Application granted
Publication of CN111817929B
Legal status: Active (current)
Anticipated expiration

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L 12/00 Data switching networks
    • H04L 12/28 Data switching networks characterised by path configuration, e.g. LAN [Local Area Networks] or WAN [Wide Area Networks]
    • H04L 12/2803 Home automation networks
    • H04L 12/2816 Controlling appliance services of a home automation network by calling their functionalities
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B 5/02 Detecting, measuring or recording pulse, heart rate, blood pressure or blood flow; Combined pulse/heart-rate/blood pressure determination; Evaluating a cardiovascular condition not otherwise provided for, e.g. using combinations of techniques provided for in this group with electrocardiography or electroauscultation; Heart catheters for measuring blood pressure
    • A61B 5/0205 Simultaneously evaluating both cardiovascular conditions and different types of body conditions, e.g. heart and respiratory condition
    • A61B 5/02055 Simultaneously evaluating both cardiovascular condition and temperature
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B 5/16 Devices for psychotechnics; Testing reaction times; Devices for evaluating the psychological state
    • A61B 5/165 Evaluating the state of mind, e.g. depression, anxiety
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L 12/00 Data switching networks
    • H04L 12/28 Data switching networks characterised by path configuration, e.g. LAN [Local Area Networks] or WAN [Wide Area Networks]
    • H04L 12/2803 Home automation networks
    • H04L 12/2807 Exchanging configuration information on appliance services in a home automation network
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L 12/00 Data switching networks
    • H04L 12/28 Data switching networks characterised by path configuration, e.g. LAN [Local Area Networks] or WAN [Wide Area Networks]
    • H04L 12/2803 Home automation networks
    • H04L 12/2816 Controlling appliance services of a home automation network by calling their functionalities
    • H04L 12/282 Controlling appliance services of a home automation network by calling their functionalities based on user interaction within the home

Landscapes

  • Health & Medical Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Automation & Control Theory (AREA)
  • Cardiology (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • Surgery (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Physiology (AREA)
  • Veterinary Medicine (AREA)
  • Public Health (AREA)
  • General Health & Medical Sciences (AREA)
  • Physics & Mathematics (AREA)
  • Animal Behavior & Ethology (AREA)
  • Biophysics (AREA)
  • Pathology (AREA)
  • Biomedical Technology (AREA)
  • Psychiatry (AREA)
  • Medical Informatics (AREA)
  • Molecular Biology (AREA)
  • Developmental Disabilities (AREA)
  • Child & Adolescent Psychology (AREA)
  • Social Psychology (AREA)
  • Psychology (AREA)
  • Hospice & Palliative Care (AREA)
  • Human Computer Interaction (AREA)
  • Educational Technology (AREA)
  • Pulmonology (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The application relates to a device interaction method and apparatus, a household device, and a storage medium. The method comprises the following steps: receiving an interaction triggering instruction input by a current user, where the interaction triggering instruction comprises identity information of the current user and interaction data of the current user; determining, based on the identity information of the current user, a target interaction mode corresponding to the identity information of the current user in a preset mapping relation, where the mapping relation comprises identity information of a plurality of users and an interaction mode corresponding to the identity information of each user; and, after determining corresponding target response data according to the interaction data, outputting the target response data to the current user based on the target interaction mode. By adopting the method, the intelligence of human-computer interaction can be improved.

Description

Equipment interaction method and device, household equipment and storage medium
Technical Field
The present application relates to the field of artificial intelligence technologies, and in particular to a device interaction method and apparatus, a home device, and a storage medium.
Background
With the continuous development of artificial intelligence and Internet of Things technologies, smart homes have become increasingly common. A smart home connects the various devices in a house through the Internet of Things to provide functions such as home appliance control, environmental monitoring, and burglar alarms, enabling all-round information interaction and bringing great convenience to people's lives.
In the related art, the manufacturer configures the interaction mode between a smart home device and its users before the device leaves the factory. When a user later uses the device at home, the device interacts with every user in the same preset, unified interaction mode, regardless of who that user is.
However, this approach suffers from poor human-computer interaction intelligence.
Disclosure of Invention
In view of the foregoing, it is desirable to provide a device interaction method and apparatus, a home device, and a storage medium that can improve the intelligence of human-computer interaction.
A device interaction method, the method comprising:
receiving an interaction triggering instruction input by a current user, where the interaction triggering instruction comprises identity information of the current user and interaction data of the current user;
determining, based on the identity information of the current user, a target interaction mode corresponding to the identity information of the current user in a preset mapping relation, where the mapping relation comprises identity information of a plurality of users and an interaction mode corresponding to the identity information of each user; and
after determining corresponding target response data according to the interaction data, outputting the target response data to the current user based on the target interaction mode.
In one embodiment, the outputting the target response data to the current user based on the target interaction mode includes:
collecting sign data of the current user, and determining an emotional state of the current user according to the sign data; and
outputting the target response data to the current user according to the emotional state of the current user and the target interaction mode.
In one embodiment, the outputting the target response data to the current user according to the emotional state of the current user and the target interaction mode includes:
fusing the emotional state of the current user with the target interaction mode to obtain a target response mode; and
outputting the target response data to the current user in the target response mode.
In one embodiment, the target interaction mode includes at least one of the following: interacting with a custom name corresponding to the current user, interacting with a custom personality corresponding to the current user, interacting with a custom character corresponding to the current user, and interacting with a custom timbre corresponding to the current user.
In one embodiment, the method for establishing the mapping relationship includes:
receiving a configuration instruction input by a user, where the configuration instruction comprises the identity information of the user and the interaction mode required by the user; and
changing the built-in factory interaction mode into the interaction mode required by the user, and establishing a correspondence between the identity information of the user and the interaction mode required by the user to obtain the mapping relation.
In one embodiment, the method for establishing the mapping relationship includes:
acquiring identity information of different users and preference data of each user, where the preference data of each user comprises at least one of sports data, life data, entertainment data, and shopping data of the user;
generating an interaction mode corresponding to each user according to the preference data of each user; and
obtaining the mapping relation according to the interaction mode corresponding to each user and the identity information of each user.
In one embodiment, if the current user is a plurality of first users, the determining, based on the identity information of the current user, a target interaction mode corresponding to the identity information of the current user in a preset mapping relationship includes:
determining identity information of a target user according to the identity information of the plurality of first users; and
determining, based on the identity information of the target user, the target interaction mode corresponding to the identity information of the target user in the preset mapping relation.
In one embodiment, the determining the identity information of the target user according to the identity information of the plurality of first users includes:
determining, according to the identity information of the plurality of first users, overlapping identity information in the identity information of the plurality of first users; and
determining the overlapping identity information as the identity information of the target user.
In one embodiment, the determining the identity information of the target user according to the identity information of the plurality of first users includes:
acquiring a priority of the identity information of each first user, where the priority represents the order in which each first user uses the device; and
determining the identity information of the first user with the highest priority as the identity information of the target user.
A device interaction apparatus, the apparatus comprising:
The receiving module is used for receiving an interaction triggering instruction input by a current user; the interaction triggering instruction comprises the identity information of the current user and the interaction data of the current user;
the determining module is used for determining a target interaction mode corresponding to the identity information of the current user in a preset mapping relation based on the identity information of the current user; the mapping relation comprises identity information of a plurality of users and interaction modes corresponding to the identity information of each user;
and the output module is used for outputting the target response data to the current user based on the target interaction mode after determining the corresponding target response data according to the interaction data.
A home device comprising a memory and a processor, the memory storing a computer program, the processor implementing the following steps when executing the computer program:
receiving an interaction triggering instruction input by a current user; the interaction triggering instruction comprises the identity information of the current user and the interaction data of the current user;
Determining a target interaction mode corresponding to the identity information of the current user in a preset mapping relation based on the identity information of the current user; the mapping relation comprises identity information of a plurality of users and interaction modes corresponding to the identity information of each user;
And after the corresponding target response data is determined according to the interaction data, outputting the target response data to the current user based on the target interaction mode.
A computer readable storage medium having stored thereon a computer program which when executed by a processor performs the steps of:
receiving an interaction triggering instruction input by a current user; the interaction triggering instruction comprises the identity information of the current user and the interaction data of the current user;
Determining a target interaction mode corresponding to the identity information of the current user in a preset mapping relation based on the identity information of the current user; the mapping relation comprises identity information of a plurality of users and interaction modes corresponding to the identity information of each user;
And after the corresponding target response data is determined according to the interaction data, outputting the target response data to the current user based on the target interaction mode.
According to the device interaction method and apparatus, the home device, and the storage medium, an interaction triggering instruction input by a current user is received, where the interaction triggering instruction comprises identity information and interaction data of the current user; a target interaction mode corresponding to the identity information of the current user is determined in a preset mapping relation based on the identity information of the current user, where the mapping relation comprises identity information of a plurality of users and an interaction mode corresponding to the identity information of each user; and after corresponding target response data is determined according to the interaction data, the target response data is output to the current user based on the target interaction mode. In this way, a dedicated personalized interaction mode for each user can be obtained from the pre-established mapping relation, so that once the response data corresponding to the interaction data is obtained, it can be output in the user's own interaction mode rather than in a single interaction mode shared by all users. The method therefore meets the personalized requirements of different users and improves the intelligence of interaction between users and the device.
Drawings
FIG. 1 is an application environment diagram of a device interaction method in one embodiment;
FIG. 2 is a flow diagram of a method of device interaction in one embodiment;
FIG. 3 is a flow chart of a method of device interaction in another embodiment;
FIG. 4 is a flow chart illustrating the establishment of a mapping relationship in one embodiment;
FIG. 5 is a flowchart illustrating a mapping relationship establishment in another embodiment;
FIG. 6 is a flow chart of a method of device interaction in another embodiment;
FIG. 7 is a block diagram of the structure of the device interaction apparatus in one embodiment;
Fig. 8 is an internal structural diagram of a home device in one embodiment.
Detailed Description
The present application will be described in further detail with reference to the drawings and examples, in order to make the objects, technical solutions and advantages of the present application more apparent. It should be understood that the specific embodiments described herein are for purposes of illustration only and are not intended to limit the scope of the application.
At present, before a smart home device leaves the factory, the manufacturer usually sets the mode in which the device interacts with people. When a user later uses the device at home, it interacts with every user in the same preset, unified interaction mode, regardless of who that user is, so this technology suffers from poor human-computer interaction intelligence. The present application provides a device interaction method and apparatus, a household device, and a storage medium that solve this technical problem.
The device interaction method provided by the application can be applied to an application environment shown in figure 1. The user device 102 may communicate with the home device 104 (illustrated in fig. 1 by an intelligent refrigerator), the user device 102 may be, but is not limited to, various personal computers, notebook computers, smart phones, tablet computers, and portable wearable devices, and the home device 104 may be, but is not limited to, an intelligent refrigerator, an intelligent air conditioner, an intelligent rice cooker, an intelligent microwave oven, an intelligent water heater, an intelligent television, an intelligent sound box, an intelligent curtain, an intelligent lamp curtain wall, an intelligent wallpaper, and the like in a user's home. In addition, the user device 102 and the home device 104 may also communicate through the server 106, and interaction data between the user device 102 and the home device 104 may be forwarded through the server 106, where the server 106 may be implemented by a separate server or a server cluster formed by multiple servers.
The method of the embodiments of the present application may be executed by a home device or by a device interaction apparatus; the following description takes the home device as the executing entity.
In one embodiment, a device interaction method is provided, and this embodiment relates to a specific process of how to implement personalized interaction between home devices and users. As shown in fig. 2, the method may include the steps of:
S202, receiving an interaction triggering instruction input by a current user; the interaction triggering instruction comprises the identity information of the current user and the interaction data of the current user.
The interaction triggering instruction may be a voice instruction, a touch instruction, or the like. The identity information of the user may be the model of the user's terminal (the terminal may be a mobile phone, a tablet computer, a notebook computer, a smart band, or the like), the user's face information, the user's voice information, the user's certificate number (such as an identity card number or a driver's license number), the user's mobile phone number, or the like. The interaction data may be a question the user asks the home device, an opening greeting the user inputs to the home device to start a conversation, or data in other forms.
In addition, the interaction data of the current user may be input directly on the home device by voice or touch, or input by the current user on a terminal by voice or touch and then forwarded to the home device by the server; other input modes are also possible. The identity information of the current user may be obtained by recognizing the current user's face through a camera on the home device, by the home device recognizing the current user's voice, by the home device recognizing the model of the current user's terminal, or even by the current user entering the identity information on the terminal and having it forwarded to the home device through the server; it may also be obtained in other ways, which are not specifically limited in this embodiment.
S204, determining a target interaction mode corresponding to the identity information of the current user in a preset mapping relation based on the identity information of the current user; the mapping relation comprises identity information of a plurality of users and interaction modes corresponding to the identity information of each user.
The mapping relationship may be obtained by performing custom configuration according to a configuration instruction input by a user, or may be generated according to preference data of the user.
Optionally, the target interaction mode includes at least one of the following: interacting with a custom name corresponding to the current user, interacting with a custom personality corresponding to the current user, interacting with a custom character corresponding to the current user, and interacting with a custom timbre corresponding to the current user.
That is, each user may have a dedicated personalized interaction mode, and these modes are stored in the mapping relation in advance. Each user's interaction mode may use at least one of a custom name, personality, character, timbre, and the like; if a user's interaction mode includes several custom elements, those elements are generally packaged together as one overall interaction mode.
Specifically, after obtaining the identity information of the current user, the home device may match the identity information of the current user with a plurality of identity information in the mapping relationship to obtain matched identity information, and record an interaction mode corresponding to the matched identity information as a target interaction mode.
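Purely as an illustration (not part of the patent text), the lookup described above can be sketched as a simple dictionary match; the class, field names, and example identities below are assumptions:

```python
from dataclasses import dataclass
from typing import Dict, Optional

@dataclass
class InteractionMode:
    """One user's dedicated interaction mode; unset elements fall back to factory defaults."""
    name: str = "assistant"        # custom name
    personality: str = "neutral"   # custom personality
    character: str = "default"     # custom character
    timbre: str = "standard"       # custom timbre

# Preset mapping relation: identity information -> interaction mode.
# Keys could equally be face IDs, voiceprints, terminal models, certificate numbers, etc.
MAPPING: Dict[str, InteractionMode] = {
    "child": InteractionMode(name="Sun Wukong", personality="playful", timbre="cartoon"),
    "mother": InteractionMode(name="Home Butler", personality="gentle", timbre="soft"),
}

def find_target_interaction_mode(identity: str) -> Optional[InteractionMode]:
    """Match the current user's identity information against the mapping relation."""
    return MAPPING.get(identity)

if __name__ == "__main__":
    print(find_target_interaction_mode("child"))
```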
S206, after corresponding target response data are determined according to the interaction data, outputting the target response data to the current user based on the target interaction mode.
In this embodiment, after obtaining the interaction data of the current user, the home device may find the target response data corresponding to the interaction data in the built-in session library. The built-in session library may be built up in advance from historical interaction data and corresponding historical response data.
After the target response data and the target interaction mode are obtained, part of the data in the target response data can be changed into custom data in the target interaction mode, or the target response data can be completely converted into response data in the target interaction mode, and then the changed or converted response data is output to the current user.
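As a rough sketch only, the session-library lookup and the "partial conversion" into the target interaction mode could look like the following; the session-library contents and the mode_name parameter are illustrative assumptions:

```python
# Hypothetical built-in session library: interaction data -> response data,
# built in advance from historical interaction data and historical response data.
SESSION_LIBRARY = {
    "turn on the air conditioner": "Target temperature 25 degrees, current room temperature 32 degrees.",
    "what is the weather today": "Sunny, 18 to 27 degrees.",
}

def build_response(interaction_data: str, mode_name: str) -> str:
    """Find the target response data, then convert part of it into the target interaction mode
    (here, simply prefixing the user's custom name for the device)."""
    raw = SESSION_LIBRARY.get(interaction_data, "Sorry, I did not understand that.")
    return f"{mode_name} at your service. {raw}"

print(build_response("turn on the air conditioner", "Sun Wukong"))
```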
For example, assume the current user has given the home device a character, such as a strong man or a delicate woman. When outputting the target response data, the home device may then preface its output with "The strong man (or the delicate woman) reminds you ...", and so on. Or a child may name the air conditioner at home Sun Wukong or after a classmate; when the child turns on the air conditioner, the air conditioner combines the name with the response data and feeds back: "Sun Wukong at your service, the target temperature is 25 °C, the current room temperature is 32 °C, it is too hot today to go up the mountain to fight tigers" (content that can be obtained through related technologies); the next time the child turns on the air conditioner, it may feed back: "Sun Wukong reminds you, there is a high-temperature alert today, the asphalt pavement temperature has reached 43 °C", and so on. When the response data is output to the user, the output mode may be voice output, display of the response data to the current user on a display screen of the home device, a combination of display and voice, or forwarding of the response data to the current user's terminal through the server, after which it is output by the terminal's voice and/or display screen. Voice output may be realized through dialogue or linkage by an intelligent sound box or other intelligent device with audio processing; display output may be realized by an intelligent lamp curtain wall, intelligent wallpaper, a display screen, and the like. A simulation robot may also be used to output the response data; its appearance may be generated by 3D printing according to the user's preference, or updated every year or every month, for example by regularly sending purchase links, and so on.
Of course, when the user arrives home or leaves home, the user may not actively input interaction data. As long as the home device detects, through linkage, that the user has arrived home or is leaving, the interaction data may default to arriving home or leaving home, and the device may output content such as "welcome home", "have a smooth trip", or "be careful on the road" through pre-recorded audio and video, or output such content in the target interaction mode corresponding to the user's identity information. Weather information may also be associated: after the welcome-home content is output, the weather conditions of the day or of the coming days may be output to the user, and corresponding prompt information may even be output according to the weather conditions. In this way, the user feels as if communicating with a real person, which relieves the user's sense of loneliness, so the intelligence of human-computer interaction can be improved and the user experience enhanced.
In the device interaction method above, an interaction triggering instruction input by the current user is received, where the instruction comprises the identity information and interaction data of the current user; a target interaction mode corresponding to the identity information of the current user is determined in a preset mapping relation based on that identity information, where the mapping relation comprises identity information of a plurality of users and the interaction mode corresponding to the identity information of each user; and after corresponding target response data is determined according to the interaction data, the target response data is output to the current user based on the target interaction mode. In this method, a dedicated personalized interaction mode for each user can be obtained from the pre-established mapping relation, so that once the response data corresponding to the interaction data is obtained, it can be output in the user's own interaction mode rather than in a single interaction mode shared by all users. The method therefore meets the personalized requirements of different users and improves the intelligence of interaction between users and the device.
In another embodiment, another device interaction method is provided, and this embodiment relates to a specific process of how a home device interacts with a user based on the emotional state of the user. On the basis of the above embodiment, as shown in fig. 3, the step of outputting the target response data to the current user based on the target interaction pattern in S206 may include the steps of:
S302, collecting sign data of the current user, and determining the emotion state of the current user according to the sign data.
The sign data may be data such as the user's body temperature, pulse, blood pressure, and respiration, and can be obtained by arranging corresponding sensors in the home device and using the arranged sensors to take measurements from the user.
Correspondingly, a threshold range corresponding to each kind of sign data may also be provided inside the home device. After the home device obtains the current user's sign data from the sensors, it may match the sign data of the current user against the corresponding threshold ranges and derive the user's emotional state from the matching result. For example, in one possible implementation, if the sign data exceeds its threshold range, the current user's emotional state may be considered bad, e.g., low or excited; further, if the sign data falls below the lower end of the threshold range, the emotional state may be considered low, and if it exceeds the upper end, the emotional state may be considered excited. In another possible implementation, if the sign data does not exceed the threshold range, the current user's emotional state may be considered normal.
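The threshold matching just described could be sketched as below; the specific threshold values are illustrative assumptions, not values given in the patent:

```python
# Assumed threshold ranges for each kind of sign data: (lower bound, upper bound).
THRESHOLD_RANGES = {
    "pulse": (60, 100),               # beats per minute
    "respiration": (12, 20),          # breaths per minute
    "body_temperature": (36.0, 37.3), # degrees Celsius
}

def estimate_emotional_state(sign_data: dict) -> str:
    """Match each measurement against its threshold range, as described above."""
    for name, value in sign_data.items():
        low, high = THRESHOLD_RANGES.get(name, (float("-inf"), float("inf")))
        if value < low:
            return "low"       # below the range -> emotional state considered low
        if value > high:
            return "excited"   # above the range -> emotional state considered excited
    return "normal"            # every reading inside its range -> emotional state normal

print(estimate_emotional_state({"pulse": 112, "respiration": 18}))  # -> "excited"
```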
S304, outputting target response data to the current user according to the emotion state and the target interaction mode of the current user.
In this step, after the emotional state and the target interaction mode of the current user are obtained, the emotional state of the current user may optionally be fused with the target interaction mode to obtain a target response mode, and the target response data may then be output to the current user in the target response mode.
That is, after the emotional state of the current user is obtained, intonation and rhythm variation can be added to the voice in order to avoid cold, mechanical word-by-word pronunciation. Specifically, the home device may look up the intonation and rhythm corresponding to the emotional state, combine them with the name, personality, character, timbre, and the like in the target interaction mode to obtain a combined response mode, record the combined response mode as the target response mode, convert the target response data into the target response mode, and output it to the current user after conversion.
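A minimal sketch of this fusion step, assuming the interaction mode is represented as a dictionary and the prosody presets are hypothetical:

```python
# Assumed intonation/rhythm presets keyed by emotional state.
PROSODY = {
    "excited": {"tone": "soothing", "rhythm": "slow"},
    "low":     {"tone": "encouraging", "rhythm": "upbeat"},
    "normal":  {"tone": "neutral", "rhythm": "normal"},
}

def fuse_response_mode(emotional_state: str, target_interaction_mode: dict) -> dict:
    """Combine state-dependent tone and rhythm with the user's target interaction mode
    (name, personality, character, timbre) to obtain the target response mode."""
    target_response_mode = dict(target_interaction_mode)
    target_response_mode.update(PROSODY.get(emotional_state, PROSODY["normal"]))
    return target_response_mode

print(fuse_response_mode("excited", {"name": "Sun Wukong", "timbre": "cartoon"}))
# {'name': 'Sun Wukong', 'timbre': 'cartoon', 'tone': 'soothing', 'rhythm': 'slow'}
```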
By way of example, if the current user's emotional state is excited, a soothing tone and rhythm may be adopted to calm the user's emotion; if the current user's emotional state is low, an encouraging tone and rhythm may be adopted to encourage the user. Alternatively, corresponding audio and video may be found according to the user's emotional state, the user may first be prompted using the target interaction mode and the target response data, and the corresponding audio and video may then be actively played. The user's physical condition may also be judged from the emotional state, and when the physical condition seems poor, more concern may be shown to the user, for example: "Master, you seem to have a slight cold; given the current social situation it may be influenza, and it is recommended to take ……"; or "Judging from your clothing this morning and today's weather, it may be cool; it is recommended ……", and so on.
According to the device interaction method provided in this embodiment, the emotional state of the current user can be obtained by collecting the current user's sign data, and the target response data is output to the current user based on both the emotional state and the target interaction mode. In this embodiment, the response data output to the user is combined not only with the user's dedicated target interaction mode but also with the user's current emotional state, so human-computer interaction performed with the method of this embodiment better matches the actual needs and actual physical condition of the current user, that is, it is more personified, and the intelligence of human-computer interaction can therefore be further improved.
In actual human-computer interaction, the target interaction mode corresponding to the current user's identity information is obtained through the mapping relation between identity information and interaction modes, so the mapping relation must be established (or determined) before it is used. Two implementable ways of establishing the mapping relation are described below: obtaining it through user-defined configuration, and generating it from the user's preference data.
First, a specific process of how the home device obtains the mapping relationship between the identity information and the interaction mode through user-defined configuration is described. On the basis of the above embodiment, as shown in fig. 4, the method for establishing the mapping relationship may include the following steps:
S402, receiving a configuration instruction input by a user; the configuration instruction comprises the identity information of the user and the interaction mode required by the user.
S404, changing the built-in factory interaction mode into an interaction mode required by a user, and establishing a corresponding relation between the identity information of the user and the interaction mode required by the user to obtain a mapping relation.
In this embodiment, the user may input the configuration instruction directly on the home device by voice or touch; for example, the user speaks to the home device to put it into a configuration state, or touches a configuration button on the display screen of the home device, and after the home device enters the configuration state, inputs the configuration instruction by voice or touch. Alternatively, the user may input the configuration instruction by voice or touch on an interface of the terminal that simulates the home device, and the instruction is forwarded to the home device through the server. Of course, the home device may also be put into the configuration state by pressing or clicking a button on a remote controller, and the configuration instruction may then be input to the home device by clicking or pressing buttons.
After receiving the configuration instruction input by the user, the home device can obtain from it the user's identity information and the interaction mode required by the user, where the required interaction mode may be at least one of a required name, personality, character, and timbre. The home device may then call up its built-in factory interaction mode (that is, the initial built-in interaction mode, which may include the name, personality, character, timbre, and the like set at the factory), change the elements of the factory interaction mode that match the elements required by the user into the interaction mode required by the user, and obtain the changed interaction mode. The user's identity information can then be bound to the corresponding changed interaction mode. Performing this operation for the configuration instruction input by any user yields the mapping relation between identity information and interaction modes. When a user subsequently interacts with the home device, the interaction mode set in the mapping relation can be adopted to interact with that user.
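A minimal sketch of this configuration step, assuming dictionary-based structures for the factory mode, the configuration instruction, and the mapping relation (all names are illustrative):

```python
FACTORY_MODE = {"name": "assistant", "personality": "neutral",
                "character": "default", "timbre": "standard"}

def handle_configuration_instruction(mapping: dict, instruction: dict) -> dict:
    """Change the built-in factory interaction mode into the mode the user requires and
    bind the result to the user's identity information in the mapping relation."""
    identity = instruction["identity"]
    required_mode = instruction["required_mode"]   # e.g. {"name": "Sun Wukong"}
    changed_mode = dict(FACTORY_MODE)
    changed_mode.update(required_mode)             # only the requested elements are changed
    mapping[identity] = changed_mode
    return mapping

mapping = {}
handle_configuration_instruction(mapping, {"identity": "child",
                                           "required_mode": {"name": "Sun Wukong"}})
print(mapping)
```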
When the mapping relation between the user's identity information and the interaction mode is obtained through user-defined configuration, the user can tailor the configuration, so the personalized requirements of different users can be met, the configured interaction mode is more accurate, and the user experience is improved.
Next, a specific procedure of how the home device generates the mapping relationship between the identity information and the interaction pattern from the preference data of the user will be described. On the basis of the above embodiment, as shown in fig. 5, the method for establishing the mapping relationship may include the following steps:
S502, acquiring identity information of different users and preference data of each user; the preference data of each user includes at least one of movement data, life data, entertainment data, shopping data of the user.
S504, generating an interaction mode corresponding to each user according to preference data of each user.
S506, obtaining a mapping relation according to the interaction mode corresponding to each user and the identity information of each user.
In this embodiment, with the support of big data, as much auxiliary information as possible can be collected through social media, articles, or communication with different users, so as to obtain each user's identity information and corresponding preference data. The preference data of each user includes sports data, life data, entertainment data, shopping data, and the like. The life data includes data such as the time of getting up, meal times, rest times, and clothing matching; the sports data includes running, swimming, ball games, and the like; the entertainment data includes data on television programs, music (particularly paid music), games (particularly electronic games), and the like; the shopping data includes data on dining out, beverages, supermarket purchases, takeaway, and the like.
After the preference data of each user is obtained, big data analysis can be used to analyze the names, characters, personalities, timbres, and the like preferred by the user, and to obtain each user's dedicated interaction mode, including at least one of a dedicated name, character, personality, timbre, and the like. For example, if the user is an anime enthusiast, the personality of the home device may be made close to the anime style the user prefers; if the user likes television dramas, the personality and character of the home device may be made close to a character in a drama the user has recently watched, the home device may be named after that character, its timbre may be set to that character's timbre, and so on. In addition, when generating the interaction mode corresponding to each user, only part of the interaction mode (such as the name, character, personality, or timbre) may be generated, while the remaining part uses the factory interaction mode of the home device.
Further, after the identity information and the exclusive interaction mode of each user are obtained, the identity information of each user and the corresponding interaction mode of each user can be bound to obtain a mapping relationship.
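Purely as an illustration of how such a mapping might be assembled (the preference analysis here is a crude stand-in for the big-data analysis described above, and all names and structures are assumptions):

```python
FACTORY_MODE = {"name": "assistant", "personality": "neutral", "timbre": "standard"}

def mode_from_preferences(preference_data: dict) -> dict:
    """Stand-in for the analysis step: derive interaction-mode elements from preference data."""
    mode = dict(FACTORY_MODE)                       # unanalyzed elements keep the factory mode
    entertainment = preference_data.get("entertainment", [])
    if "anime" in entertainment:
        mode.update({"personality": "anime-style", "timbre": "cartoon"})
    dramas = [item for item in entertainment if item.startswith("drama:")]
    if dramas:
        mode["name"] = dramas[-1].split(":", 1)[1]  # name the device after a recent drama character
    return mode

def build_mapping(users: dict) -> dict:
    """users: identity information -> preference data (sports, life, entertainment, shopping)."""
    return {identity: mode_from_preferences(prefs) for identity, prefs in users.items()}

print(build_mapping({"child": {"entertainment": ["anime", "drama:Sun Wukong"]}}))
```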
In this way, the user's dedicated interaction mode is generated from the user's preference data, and the mapping relation is obtained from that dedicated interaction mode and the corresponding identity information. No manual operation by the user is needed in the process, which saves labor and time; at the same time, because the mapping relation is derived from the user's preference data, it is accurate, and the user experience can be improved.
According to the device interaction method provided in this embodiment, the mapping relation between the user's identity information and the interaction mode can be obtained through user-defined configuration or from the user's preference data. Because both the user-defined configuration and the preference data match the user's requirements, the mapping relation obtained in this embodiment can meet the personalized requirements of different users, and the interaction modes in the resulting mapping relation are accurate, so the user experience can be improved.
It should be noted that the above embodiments describe human-computer interaction when the current user is a single user. In an actual scenario, however, there will very likely be multiple users in the home at the same time. How is human-computer interaction performed in that case? A specific embodiment is described below.
In another embodiment, another device interaction method is provided, and this embodiment relates to how to obtain a target interaction mode based on identity information of a plurality of first users if the current user is the plurality of first users, so as to implement a specific process of man-machine interaction. On the basis of the above embodiment, as shown in fig. 6, the step S204 may include the following steps:
S602, according to the identity information of a plurality of first users, determining the identity information of the target user.
The plurality of first users may be multiple different terminals of the same user, or may be several different real people, for example the elderly, children, and parents in the home. Each first user has its own identity information, which, in addition to the information mentioned in S202, may include the occupation, sex, age, height, education level, and the like of each first user.
As can be seen from the above, the identity information of the first users may partially overlap, but in general not all of it is identical, which means there are several different pieces of identity information. However, one home device can interact with users in only one target interaction mode at a time, so the identity information of a single target user must be determined. In the determination process, the following scene one and scene two may be used:
Scene one, according to the identity information of a plurality of first users, determining the overlapped identity information in the identity information of the plurality of first users; and determining the overlapped identity information as the identity information of the target user.
For example, assume the identity information of first user A includes the occupation white-collar worker, an age of 28, and a height of 170 cm, and the identity information of first user B includes the occupation white-collar worker and an age of 32. The overlapping identity information of the two users is the occupation white-collar worker, which may be determined as the identity information of the target user.
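A minimal sketch of scene one, treating each user's identity information as a set of attribute strings (the representation is an assumption):

```python
def overlapping_identity(identity_sets):
    """Scene one: keep only the identity attributes shared by every first user."""
    common = set(identity_sets[0])
    for other in identity_sets[1:]:
        common &= set(other)
    return common

user_a = {"occupation: white-collar", "age: 28", "height: 170 cm"}
user_b = {"occupation: white-collar", "age: 32"}
print(overlapping_identity([user_a, user_b]))   # {'occupation: white-collar'}
```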
Scene two, obtaining the priority of the identity information of each first user; the priority represents the sequence of each first user when using the equipment; and determining the identity information of the first user with the highest priority as the identity information of the target user.
The order in which the first users use the same home device may be configured by an administrator of the device. When the home device obtains the identity information of each first user, it may also obtain the priority the administrator configured for each first user, sort the identity information of the first users by priority, and determine the identity information of the first user ranked highest as the identity information of the target user.
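A minimal sketch of scene two; the encoding of priority as a number (smaller means earlier in the administrator-configured order of use, i.e. higher priority) is an assumption for illustration:

```python
def pick_highest_priority(first_users):
    """Scene two: return the identity information of the first user with the highest priority.
    Each entry is (identity_information, priority)."""
    return min(first_users, key=lambda entry: entry[1])[0]

users = [("grandfather", 2), ("child", 1), ("mother", 3)]
print(pick_highest_priority(users))   # -> "child"
```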
S604, determining a target interaction mode corresponding to the identity information of the target user in a preset mapping relation based on the identity information of the target user.
After the identity information of the target user is obtained, the identity information of the target user and a plurality of identity information in the mapping relation can be matched to obtain matched identity information, and an interaction mode corresponding to the matched identity information is used as an interaction mode of the target user and is recorded as a target interaction mode. Meanwhile, after response data corresponding to the interaction data are obtained, the response data can be output to a plurality of first users based on the target interaction mode.
Of course, in this embodiment, management rights over different home devices may also be allocated to different users, for example the television to the elderly and the computer to the children; these cases are the same as the case in which one user corresponds to one home device and are not described again here.
According to the device interaction method provided in this embodiment, if multiple users are using the home device at the current moment, the identity information of a target user can be determined first, and the target interaction mode corresponding to that identity information is then obtained, so that the device interacts with the multiple users in the target interaction mode. Because the identity information of the target user can be determined from the identity information of multiple users, human-computer interaction can be carried out in the corresponding interaction mode, meeting the scenario in which multiple people use the same home device.
In another embodiment, to describe the technical solution of the present application in more detail, the method is described below with reference to a more detailed embodiment and may include the following steps S1-S10:
s1, establishing a mapping relation between identity information of a user and an interaction mode through user-defined configuration or preference data of the user.
S2, receiving an interaction triggering instruction input by a current user; the interaction triggering instruction comprises the identity information of the current user and the interaction data of the current user.
S3, judging whether the current user is one user or a plurality of users according to the identity information of the current user in the interaction triggering instruction, if the current user is a plurality of users, executing S4, and if the current user is one user, executing S6.
S4, according to the identity information of the plurality of users, determining overlapped identity information in the identity information of the plurality of users, and determining the overlapped identity information as the identity information of the target user; or acquiring the priority of the identity information of each first user, and determining the identity information of the first user with the highest priority as the identity information of the target user.
S5, determining a target interaction mode corresponding to the identity information of the target user in a preset mapping relation based on the identity information of the target user.
S6, determining a target interaction mode corresponding to the identity information of the current user in a preset mapping relation based on the identity information of the current user.
S7, corresponding target response data are determined according to the interaction data of the current user.
S8, collecting sign data of the current user, and determining the emotion state of the current user according to the sign data.
And S9, fusing the emotion state of the current user and the target interaction mode to obtain a target response mode.
S10, outputting target response data to the current user in a target response mode.
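Purely as an illustration of how steps S1-S10 fit together (the data structures, default replies, and parameter names below are assumptions, not the patent's implementation):

```python
def interact(trigger, mapping, threshold_ranges, session_library, factory_mode):
    """Hypothetical end-to-end flow corresponding to S2-S10 (S1 is the prior mapping set-up)."""
    identities = trigger["identities"]                 # S2: identity information
    interaction_data = trigger["interaction_data"]     # S2: interaction data

    # S3/S4: one user, or pick a target identity for several users (priority rule used here)
    if len(identities) == 1:
        target_identity = identities[0]
    else:
        target_identity = min(identities, key=lambda i: trigger["priorities"].get(i, 99))

    # S5/S6: target interaction mode from the preset mapping relation
    mode = mapping.get(target_identity, factory_mode)

    # S7: target response data from the built-in session library
    response = session_library.get(interaction_data, "Sorry, I did not understand that.")

    # S8: emotional state from sign data
    state = "normal"
    for name, value in trigger.get("sign_data", {}).items():
        low, high = threshold_ranges.get(name, (float("-inf"), float("inf")))
        if value < low:
            state = "low"
        elif value > high:
            state = "excited"

    # S9: fuse the emotional state with the target interaction mode into a target response mode
    tone = {"low": "encouraging", "excited": "soothing", "normal": "neutral"}[state]

    # S10: output the target response data in the target response mode
    return f"[{mode['name']} | {tone} tone] {response}"

print(interact(
    trigger={"identities": ["child"], "interaction_data": "turn on the air conditioner",
             "sign_data": {"pulse": 112}},
    mapping={"child": {"name": "Sun Wukong"}},
    threshold_ranges={"pulse": (60, 100)},
    session_library={"turn on the air conditioner": "Target temperature 25 degrees."},
    factory_mode={"name": "assistant"},
))
```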
It should be understood that, although the steps in the flowcharts of figs. 2-6 are shown in an order indicated by the arrows, these steps are not necessarily performed in that order. Unless explicitly stated herein, the execution of the steps is not strictly limited to this order, and the steps may be performed in other orders. Moreover, at least some of the steps in figs. 2-6 may include multiple sub-steps or stages that are not necessarily performed at the same moment but may be performed at different moments, and the order of their execution is not necessarily sequential; they may be performed in turn or alternately with at least part of the sub-steps or stages of other steps.
In one embodiment, as shown in fig. 7, there is provided a device interaction apparatus, comprising: a receiving module 10, a determining module 11 and an output module 12, wherein:
the receiving module 10 is used for receiving an interaction triggering instruction input by a current user; the interaction triggering instruction comprises the identity information of the current user and the interaction data of the current user;
The determining module 11 is configured to determine, based on identity information of a current user, a target interaction mode corresponding to the identity information of the current user in a preset mapping relationship; the mapping relation comprises identity information of a plurality of users and interaction modes corresponding to the identity information of each user;
and the output module 12 is used for outputting the target response data to the current user based on the target interaction mode after determining the corresponding target response data according to the interaction data.
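A minimal sketch of this three-module split, assuming dictionary-based mapping and session-library structures (illustrative only, not the patent's implementation):

```python
class DeviceInteractionApparatus:
    """Receiving / determining / output modules of the apparatus described above."""

    def __init__(self, mapping: dict, session_library: dict):
        self.mapping = mapping                  # preset mapping relation
        self.session_library = session_library  # built-in session library

    def receiving_module(self, trigger: dict):
        # Receives the interaction triggering instruction input by the current user.
        return trigger["identity"], trigger["interaction_data"]

    def determining_module(self, identity: str) -> dict:
        # Determines the target interaction mode from the preset mapping relation.
        return self.mapping.get(identity, {"name": "assistant"})

    def output_module(self, interaction_data: str, mode: dict) -> str:
        # Determines the target response data and outputs it based on the target interaction mode.
        response = self.session_library.get(interaction_data, "Sorry, I did not understand that.")
        return f"{mode['name']}: {response}"

apparatus = DeviceInteractionApparatus(
    mapping={"child": {"name": "Sun Wukong"}},
    session_library={"what is the weather today": "Sunny, 18 to 27 degrees."},
)
identity, data = apparatus.receiving_module({"identity": "child",
                                             "interaction_data": "what is the weather today"})
print(apparatus.output_module(data, apparatus.determining_module(identity)))
```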
Optionally, the target interaction mode includes at least one of the following: interacting with a custom name corresponding to the current user, interacting with a custom personality corresponding to the current user, interacting with a custom character corresponding to the current user, and interacting with a custom timbre corresponding to the current user.
For specific limitations of the device interaction apparatus, reference may be made to the limitations of the device interaction method above, and no further description is given here.
In another embodiment, another device interaction apparatus is provided, and the output module 12 may include a first determining unit and an output unit, where:
The first determining unit is used for collecting sign data of the current user and determining the emotion state of the current user according to the sign data;
And the output unit is used for outputting the target response data to the current user according to the emotion state and the target interaction mode of the current user.
Optionally, the output unit is further configured to fuse an emotional state of the current user with the target interaction mode to obtain a target response mode; and outputting the target response data to the current user by adopting a target response mode.
In another embodiment, another device interaction apparatus is provided, where, based on the above embodiment, the apparatus may further include a setup module, where the setup module is configured to receive a configuration instruction input by a user; the configuration instruction comprises the identity information of the user and the interaction mode required by the user; changing the built-in factory interaction mode into an interaction mode required by a user, and establishing a corresponding relation between the identity information of the user and the interaction mode required by the user to obtain a mapping relation.
Optionally, the establishing module is further configured to obtain identity information of different users and preference data of each user; the preference data of each user comprises at least one of movement data, life data, entertainment data and shopping data of the user; generating an interaction mode corresponding to each user according to preference data of each user; and obtaining a mapping relation according to the interaction mode corresponding to each user and the identity information of each user.
In another embodiment, another device interaction apparatus is provided, where, based on the foregoing embodiment, if the current user is a plurality of first users, the determining module 11 may include a second determining unit and a third determining unit, where:
The second determining unit is used for determining the identity information of the target user according to the identity information of the plurality of first users;
and the third determining unit is used for determining a target interaction mode corresponding to the identity information of the target user in a preset mapping relation based on the identity information of the target user.
Optionally, the second determining unit is further configured to determine overlapping identity information in the identity information of the plurality of first users according to the identity information of the plurality of first users; and determining the overlapped identity information as the identity information of the target user.
Optionally, the second determining unit is further configured to obtain a priority of identity information of each first user; the priority represents the sequence of each first user when using the equipment; and determining the identity information of the first user with the highest priority as the identity information of the target user.
For specific limitations of the device interaction apparatus, reference may be made to the limitations of the device interaction method above, and no further description is given here.
The modules in the device interaction apparatus described above may be implemented in whole or in part by software, hardware, or a combination thereof. The above modules may be embedded in, or independent of, a processor of the computer device in the form of hardware, or may be stored in a memory of the computer device in the form of software, so that the processor can call and execute the operations corresponding to the above modules.
In one embodiment, a home device is provided, the internal structure of which may be as shown in fig. 8. The household equipment comprises a processor, a memory, a communication interface, a display screen and an input device which are connected through a system bus. Wherein the processor of the home device is configured to provide computing and control capabilities. The memory of the household equipment comprises a nonvolatile storage medium and an internal memory. The non-volatile storage medium stores an operating system and a computer program. The internal memory provides an environment for the operation of the operating system and computer programs in the non-volatile storage media. The communication interface of the home device is used for carrying out wired or wireless communication with an external terminal, and the wireless mode can be realized through WIFI, an operator network, NFC (near field communication) or other technologies. The computer program is executed by a processor to implement a device interaction method. The display screen of the household equipment can be a liquid crystal display screen or an electronic ink display screen, and the input device of the household equipment can be a touch layer covered on the display screen, can also be keys, a track ball or a touch pad arranged on the shell of the household equipment, and can also be an external keyboard, a touch pad or a mouse and the like.
It will be appreciated by those skilled in the art that the structure shown in fig. 8 is merely a block diagram of the part of the structure related to the solution of the present application and does not limit the home device to which the solution is applied; a particular home device may include more or fewer components than shown, combine certain components, or have a different arrangement of components.
In one embodiment, a home device is provided, comprising a memory and a processor, the memory storing a computer program, the processor implementing the following steps when executing the computer program:
receiving an interaction triggering instruction input by a current user; the interaction triggering instruction comprises the identity information of the current user and the interaction data of the current user;
determining a target interaction mode corresponding to the identity information of the current user in a preset mapping relation based on the identity information of the current user; the mapping relation comprises identity information of a plurality of users and interaction modes corresponding to the identity information of each user;
and after the corresponding target response data is determined according to the interaction data, outputting the target response data to the current user based on the target interaction mode.
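By way of a non-limiting illustration of the three steps above (a Python sketch; the dictionary-based mapping, the user identifiers, and the mode fields are assumptions of this sketch rather than part of the embodiments), the flow might look like:

from dataclasses import dataclass

@dataclass
class TriggerInstruction:
    user_id: str           # identity information of the current user
    interaction_data: str  # e.g. a spoken request such as "play some music"

# Preset mapping relation: identity information -> interaction mode (illustrative values only)
MAPPING = {
    "user_a": {"custom_name": "Xiaoyi", "custom_timbre": "warm"},
    "user_b": {"custom_name": "Captain", "custom_timbre": "deep"},
}
DEFAULT_MODE = {"custom_name": "Assistant", "custom_timbre": "default"}

def handle_trigger(instruction: TriggerInstruction) -> str:
    # Determine the target interaction mode corresponding to the current user's identity information.
    target_mode = MAPPING.get(instruction.user_id, DEFAULT_MODE)
    # Determine the target response data according to the interaction data (stubbed here).
    response_data = f"result for '{instruction.interaction_data}'"
    # Output the target response data to the current user based on the target interaction mode.
    return f"[{target_mode['custom_timbre']}] {target_mode['custom_name']}: {response_data}"

print(handle_trigger(TriggerInstruction("user_a", "play some music")))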
In one embodiment, the processor when executing the computer program further performs the steps of:
collecting physical sign data of the current user, and determining the emotional state of the current user according to the physical sign data; and outputting the target response data to the current user according to the emotional state of the current user and the target interaction mode.
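A minimal sketch of this step, assuming heart rate and body temperature as the physical sign data and purely illustrative thresholds (the embodiments do not specify which signs or which classification rules are used):

from dataclasses import dataclass

@dataclass
class SignData:
    heart_rate: float        # beats per minute
    body_temperature: float  # degrees Celsius

def estimate_emotional_state(signs: SignData) -> str:
    # Toy rule-based classification standing in for whatever model the device actually uses.
    if signs.heart_rate > 100 or signs.body_temperature > 37.5:
        return "agitated"
    if signs.heart_rate < 60:
        return "calm"
    return "neutral"

print(estimate_emotional_state(SignData(heart_rate=110, body_temperature=36.8)))  # -> "agitated"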
In one embodiment, the processor when executing the computer program further performs the steps of:
fusing the emotional state of the current user with the target interaction mode to obtain a target response mode; and outputting the target response data to the current user by adopting the target response mode.
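One way such a fusion could be sketched (the adjustment table and field names are assumptions of this sketch, not part of the embodiments) is to overlay emotion-dependent adjustments on the user's target interaction mode:

def fuse(emotional_state: str, target_mode: dict) -> dict:
    # Emotion-dependent adjustments (illustrative values only).
    adjustments = {
        "agitated": {"speaking_rate": "slow", "tone_of_voice": "soothing"},
        "calm": {"speaking_rate": "normal", "tone_of_voice": "cheerful"},
        "neutral": {"speaking_rate": "normal", "tone_of_voice": "neutral"},
    }
    # The target response mode keeps the user's customizations and adds the adjustments.
    return {**target_mode, **adjustments.get(emotional_state, {})}

print(fuse("agitated", {"custom_name": "Xiaoyi", "custom_timbre": "warm"}))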
In one embodiment, the target interaction mode includes at least one of: interacting in a mode with a custom name corresponding to the current user, interacting in a mode with a custom personality corresponding to the current user, interacting in a mode with a custom character corresponding to the current user, and interacting in a mode with a custom timbre corresponding to the current user.
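These four customizable aspects could be represented, for example, by a simple record in which every field is optional, matching the "at least one of" wording above (the field names are assumptions of this sketch):

from dataclasses import dataclass
from typing import Optional

@dataclass
class InteractionMode:
    custom_name: Optional[str] = None         # name used when the device addresses the user or names itself
    custom_personality: Optional[str] = None  # e.g. "lively" or "reserved"
    custom_character: Optional[str] = None    # persona or character setting
    custom_timbre: Optional[str] = None       # voice timbre for spoken output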
In one embodiment, the processor when executing the computer program further performs the steps of:
receiving a configuration instruction input by a user, where the configuration instruction comprises the identity information of the user and the interaction mode required by the user; and changing the built-in factory interaction mode into the interaction mode required by the user, and establishing a correspondence between the identity information of the user and the interaction mode required by the user to obtain the mapping relation.
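A sketch of this configuration path (dictionary-based, with hypothetical field names; not the embodiments' concrete implementation):

FACTORY_MODE = {"custom_name": "Assistant", "custom_timbre": "default"}
mapping = {}  # identity information -> interaction mode

def apply_configuration(user_id: str, requested_mode: dict) -> None:
    # Replace the built-in factory interaction mode with the mode the user requested,
    # then record the correspondence between the user's identity information and that mode.
    mapping[user_id] = {**FACTORY_MODE, **requested_mode}

apply_configuration("user_a", {"custom_name": "Xiaoyi", "custom_personality": "humorous"})
print(mapping)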
In one embodiment, the processor when executing the computer program further performs the steps of:
acquiring identity information of different users and preference data of each user, where the preference data of each user comprises at least one of motion data, life data, entertainment data, and shopping data of the user; generating an interaction mode corresponding to each user according to the preference data of each user; and obtaining the mapping relation according to the interaction mode corresponding to each user and the identity information of each user.
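A possible sketch of deriving a mode from preference data (the heuristics below are invented purely for illustration; the embodiments do not prescribe how preferences translate into a mode):

def mode_from_preferences(preferences: dict) -> dict:
    # preferences may contain motion, life, entertainment, and shopping data.
    mode = {}
    if preferences.get("entertainment") == "comedy":
        mode["custom_personality"] = "humorous"
    if preferences.get("motion") == "frequent_exercise":
        mode["custom_timbre"] = "energetic"
    return mode or {"custom_personality": "neutral"}

def build_mapping(preferences_by_user: dict) -> dict:
    # preferences_by_user: identity information -> preference data
    return {user_id: mode_from_preferences(p) for user_id, p in preferences_by_user.items()}

print(build_mapping({"user_a": {"entertainment": "comedy"}, "user_b": {"motion": "frequent_exercise"}}))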
In one embodiment, the processor when executing the computer program further performs the steps of:
determining the identity information of a target user according to the identity information of a plurality of first users; and determining a target interaction mode corresponding to the identity information of the target user in a preset mapping relation based on the identity information of the target user.
In one embodiment, the processor when executing the computer program further performs the steps of:
determining, according to the identity information of the plurality of first users, overlapping identity information in the identity information of the plurality of first users; and determining the overlapping identity information as the identity information of the target user.
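If each first user's identity information is modelled as a set of attributes (an assumption of this sketch), the overlapping identity information is simply their intersection:

def overlapping_identity(identities: list) -> set:
    # identities: one set of identity attributes per detected first user.
    result = set(identities[0])
    for ident in identities[1:]:
        result &= set(ident)
    return result

# e.g. two family members who are both adults share the "family" and "adult" attributes:
print(overlapping_identity([{"family", "adult", "dad"}, {"family", "adult", "mom"}]))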
In one embodiment, the processor when executing the computer program further performs the steps of:
acquiring the priority of the identity information of each first user, where the priority represents the order in which each first user uses the device; and determining the identity information of the first user with the highest priority as the identity information of the target user.
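Sketching the priority rule (assuming, for illustration only, that a smaller number means the user ranks earlier in the order of device usage):

def select_target_user(priorities: dict) -> str:
    # priorities: identity information -> priority value; the highest-priority user
    # (numerically smallest under this sketch's convention) becomes the target user.
    return min(priorities, key=priorities.get)

print(select_target_user({"dad": 2, "mom": 1, "child": 3}))  # -> "mom"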
In one embodiment, a computer readable storage medium is provided having a computer program stored thereon, which when executed by a processor, performs the steps of:
receiving an interaction triggering instruction input by a current user; the interaction triggering instruction comprises the identity information of the current user and the interaction data of the current user;
determining a target interaction mode corresponding to the identity information of the current user in a preset mapping relation based on the identity information of the current user; the mapping relation comprises identity information of a plurality of users and interaction modes corresponding to the identity information of each user;
and after the corresponding target response data is determined according to the interaction data, outputting the target response data to the current user based on the target interaction mode.
In one embodiment, the computer program when executed by the processor further performs the steps of:
collecting physical sign data of the current user, and determining the emotional state of the current user according to the physical sign data; and outputting the target response data to the current user according to the emotional state of the current user and the target interaction mode.
In one embodiment, the computer program when executed by the processor further performs the steps of:
fusing the emotional state of the current user with the target interaction mode to obtain a target response mode; and outputting the target response data to the current user by adopting the target response mode.
In one embodiment, the target interaction mode includes at least one of: interacting in a mode with a custom name corresponding to the current user, interacting in a mode with a custom personality corresponding to the current user, interacting in a mode with a custom character corresponding to the current user, and interacting in a mode with a custom timbre corresponding to the current user.
In one embodiment, the computer program when executed by the processor further performs the steps of:
receiving a configuration instruction input by a user, where the configuration instruction comprises the identity information of the user and the interaction mode required by the user; and changing the built-in factory interaction mode into the interaction mode required by the user, and establishing a correspondence between the identity information of the user and the interaction mode required by the user to obtain the mapping relation.
In one embodiment, the computer program when executed by the processor further performs the steps of:
acquiring identity information of different users and preference data of each user, where the preference data of each user comprises at least one of motion data, life data, entertainment data, and shopping data of the user; generating an interaction mode corresponding to each user according to the preference data of each user; and obtaining the mapping relation according to the interaction mode corresponding to each user and the identity information of each user.
In one embodiment, the computer program when executed by the processor further performs the steps of:
determining the identity information of a target user according to the identity information of a plurality of first users; and determining a target interaction mode corresponding to the identity information of the target user in a preset mapping relation based on the identity information of the target user.
In one embodiment, the computer program when executed by the processor further performs the steps of:
determining, according to the identity information of the plurality of first users, overlapping identity information in the identity information of the plurality of first users; and determining the overlapping identity information as the identity information of the target user.
In one embodiment, the computer program when executed by the processor further performs the steps of:
acquiring the priority of the identity information of each first user, where the priority represents the order in which each first user uses the device; and determining the identity information of the first user with the highest priority as the identity information of the target user.
Those skilled in the art will appreciate that all or part of the processes in the above-described methods may be implemented by a computer program; the computer program may be stored in a non-transitory computer-readable storage medium and, when executed, may include the processes of the embodiments of the methods described above. Any reference to memory, storage, database, or other medium used in the embodiments provided herein may include at least one of non-volatile and volatile memory. The non-volatile memory may include read-only memory (ROM), magnetic tape, floppy disk, flash memory, optical memory, or the like. The volatile memory may include random access memory (RAM) or external cache memory. By way of illustration and not limitation, RAM may take various forms, such as static random access memory (SRAM) or dynamic random access memory (DRAM).
The technical features of the above embodiments may be combined arbitrarily. For brevity, not all possible combinations of the technical features in the above embodiments are described; however, as long as a combination of technical features contains no contradiction, it should be considered within the scope of this specification.
The above examples illustrate only a few embodiments of the application; they are described in detail but are not to be construed as limiting the scope of the application. It should be noted that those skilled in the art may make several variations and modifications without departing from the spirit of the application, all of which fall within the scope of the application. Accordingly, the scope of protection of the present application is determined by the appended claims.

Claims (10)

1. A method of device interaction, the method comprising:
receiving an interaction triggering instruction input by a current user; the interaction triggering instruction comprises the identity information of the current user and the interaction data of the current user;
determining a target interaction mode corresponding to the identity information of the current user in a preset mapping relation based on the identity information of the current user; the mapping relation comprises identity information of a plurality of users and interaction modes corresponding to the identity information of each user; the target interaction mode of each user is an interaction mode determined by at least one of custom naming, personality, character and timbre;
after corresponding target response data are determined according to the interaction data, outputting the target response data to the current user based on the target interaction mode;
the outputting the target response data to the current user based on the target interaction mode includes:
collecting physical sign data of the current user, and determining the emotional state of the current user according to the physical sign data;
outputting the target response data to the current user according to the emotional state of the current user and the target interaction mode;
if the current user comprises a plurality of first users, the determining a target interaction mode corresponding to the identity information of the current user in a preset mapping relation based on the identity information of the current user includes:
determining the identity information of the target user according to the identity information of the plurality of first users;
determining a target interaction mode corresponding to the identity information of the target user in a preset mapping relation based on the identity information of the target user;
the determining the identity information of the target user according to the identity information of the plurality of first users comprises the following steps:
determining, according to the identity information of the plurality of first users, overlapping identity information in the identity information of the plurality of first users;
and determining the overlapping identity information as the identity information of the target user.
2. The method of claim 1, wherein the outputting the target response data to the current user according to the emotional state of the current user and the target interaction mode comprises:
fusing the emotional state of the current user with the target interaction mode to obtain a target response mode;
and outputting the target response data to the current user by adopting the target response mode.
3. The method according to claim 2, wherein the outputting the target response data to the current user by adopting the target response mode includes:
changing part of the data in the target response data into custom data in the target interaction mode, and outputting the custom data to the current user;
or converting all of the target response data into response data in the target interaction mode, and outputting the response data to the current user.
4. The method according to any one of claims 1-2, wherein the target interaction mode comprises at least one of: interacting in a mode with a custom name corresponding to the current user, interacting in a mode with a custom personality corresponding to the current user, interacting in a mode with a custom character corresponding to the current user, and interacting in a mode with a custom timbre corresponding to the current user.
5. The method according to any one of claims 1-2, wherein the establishing manner of the mapping relation includes:
receiving a configuration instruction input by a user; the configuration instruction comprises the identity information of the user and the interaction mode required by the user;
changing the built-in factory interaction mode into the interaction mode required by the user, and establishing a corresponding relation between the identity information of the user and the interaction mode required by the user to obtain the mapping relation.
6. The method according to any one of claims 1-2, wherein the establishing manner of the mapping relation includes:
acquiring identity information of different users and preference data of each user; the preference data of each user comprises at least one of motion data, life data, entertainment data and shopping data of the user;
generating an interaction mode corresponding to each user according to the preference data of each user;
and obtaining the mapping relation according to the interaction mode corresponding to each user and the identity information of each user.
7. The method of claim 1, wherein determining the identity information of the target user based on the identity information of the plurality of first users comprises:
acquiring the priority of the identity information of each first user; the priority represents the order in which each first user uses the device;
and determining the identity information of the first user with the highest priority as the identity information of the target user.
8. A device interaction apparatus, the apparatus comprising:
The receiving module is used for receiving an interaction triggering instruction input by a current user; the interaction triggering instruction comprises the identity information of the current user and the interaction data of the current user;
The determining module is used for determining a target interaction mode corresponding to the identity information of the current user in a preset mapping relation based on the identity information of the current user; the mapping relation comprises identity information of a plurality of users and interaction modes corresponding to the identity information of each user; the target interaction mode of each user is an interaction mode determined by at least one of custom naming, personality, character and timbre;
the output module is used for outputting the target response data to the current user based on the target interaction mode after determining the corresponding target response data according to the interaction data;
The output module includes:
the first determining unit is used for collecting physical sign data of the current user and determining the emotional state of the current user according to the physical sign data;
the output unit is used for outputting the target response data to the current user according to the emotional state of the current user and the target interaction mode;
If the current user comprises a plurality of first users, the determining module includes a second determining unit and a third determining unit, where:
the second determining unit is used for determining the identity information of the target user according to the identity information of the plurality of first users, and is further used for determining overlapping identity information in the identity information of the plurality of first users according to the identity information of the plurality of first users and determining the overlapping identity information as the identity information of the target user;
The third determining unit is configured to determine, based on identity information of the target user, a target interaction mode corresponding to the identity information of the target user in a preset mapping relationship.
9. A household device comprising a memory and a processor, said memory storing a computer program, characterized in that the processor implements the steps of the method of any one of claims 1 to 7 when executing said computer program.
10. A computer readable storage medium, on which a computer program is stored, characterized in that the computer program, when being executed by a processor, implements the steps of the method of any of claims 1 to 7.
CN202010482975.8A 2020-06-01 2020-06-01 Equipment interaction method and device, household equipment and storage medium Active CN111817929B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010482975.8A CN111817929B (en) 2020-06-01 2020-06-01 Equipment interaction method and device, household equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010482975.8A CN111817929B (en) 2020-06-01 2020-06-01 Equipment interaction method and device, household equipment and storage medium

Publications (2)

Publication Number Publication Date
CN111817929A CN111817929A (en) 2020-10-23
CN111817929B true CN111817929B (en) 2024-05-14

Family

ID=72848039

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010482975.8A Active CN111817929B (en) 2020-06-01 2020-06-01 Equipment interaction method and device, household equipment and storage medium

Country Status (1)

Country Link
CN (1) CN111817929B (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113137697A (en) * 2021-04-19 2021-07-20 青岛海尔空调电子有限公司 Control method and device of air conditioner and computer readable storage medium
CN113160832A (en) * 2021-04-30 2021-07-23 合肥美菱物联科技有限公司 Voice washing machine intelligent control system and method supporting voiceprint recognition
CN113934299B (en) * 2021-10-18 2024-01-30 珠海格力电器股份有限公司 Equipment interaction method and device, intelligent household equipment and processor
CN114137841B (en) * 2021-10-28 2024-03-22 青岛海尔科技有限公司 Control method, equipment and system of Internet of things equipment


Patent Citations (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106113038A (en) * 2016-07-08 2016-11-16 纳恩博(北京)科技有限公司 Mode switching method based on robot and device
CN108073605A (en) * 2016-11-10 2018-05-25 阿里巴巴集团控股有限公司 A kind of loading of business datum, push, the generation method of interactive information and device
CN106682090A (en) * 2016-11-29 2017-05-17 上海智臻智能网络科技股份有限公司 Active interaction implementing device, active interaction implementing method and intelligent voice interaction equipment
CN107483493A (en) * 2017-09-18 2017-12-15 广东美的制冷设备有限公司 Interactive calendar prompting method, device, storage medium and intelligent domestic system
CN107819651A (en) * 2017-09-30 2018-03-20 深圳市艾特智能科技有限公司 Intelligent home equipment control method, device, storage medium and computer equipment
CN108351707A (en) * 2017-12-22 2018-07-31 深圳前海达闼云端智能科技有限公司 Man-machine interaction method and device, terminal equipment and computer readable storage medium
CN108874895A (en) * 2018-05-22 2018-11-23 北京小鱼在家科技有限公司 Interactive information method for pushing, device, computer equipment and storage medium
CN109002022A (en) * 2018-08-16 2018-12-14 陕西卓居未来智能科技有限公司 A kind of cloud intelligent steward system and operating method based on interactive voice ability
CN109065035A (en) * 2018-09-06 2018-12-21 珠海格力电器股份有限公司 information interaction method and device
CN109409063A (en) * 2018-10-10 2019-03-01 北京小鱼在家科技有限公司 A kind of information interacting method, device, computer equipment and storage medium
CN111177329A (en) * 2018-11-13 2020-05-19 奇酷互联网络科技(深圳)有限公司 User interaction method of intelligent terminal, intelligent terminal and storage medium
CN111061953A (en) * 2019-12-18 2020-04-24 深圳市优必选科技股份有限公司 Intelligent terminal interaction method and device, terminal equipment and storage medium

Also Published As

Publication number Publication date
CN111817929A (en) 2020-10-23

Similar Documents

Publication Publication Date Title
CN111817929B (en) Equipment interaction method and device, household equipment and storage medium
US11353259B2 (en) Augmented-reality refrigerator and method of controlling thereof
KR102322034B1 (en) Image display method of a apparatus with a switchable mirror and the apparatus
US20160093081A1 (en) Image display method performed by device including switchable mirror and the device
US10991462B2 (en) System and method of controlling external apparatus connected with device
CN104932455B (en) The group technology and apparatus for grouping of smart machine in intelligent domestic system
US11732961B2 (en) Augmented-reality refrigerator and method of controlling thereof
CN107483493A (en) Interactive calendar prompting method, device, storage medium and intelligent domestic system
CN104731829B (en) A kind of interactive approach and device of network picture
US8909636B2 (en) Lifestyle collecting apparatus, user interface device, and lifestyle collecting method
CN108432190A (en) Response message recommends method and its equipment
KR20200085143A (en) Conversational control system and method for registering external apparatus
CN107490971A (en) Intelligent automation assistant in home environment
CN105893771A (en) Information service method and device and device used for information services
CN109754316A (en) Products Show method, Products Show system and storage medium
CN108132983A (en) The recommendation method and device of clothing matching, readable storage medium storing program for executing, electronic equipment
KR20180096182A (en) Electronic device and method for controlling the same
CN105278336A (en) Application display method and apparatus, and terminal
US10936140B2 (en) Method and device for displaying response
CN101548531A (en) Configurable personal audiovisual device for use in networked application-sharing system
KR20200094833A (en) Method and platform for providing ai entities being evolved through reinforcement machine learning
CN108744495A (en) A kind of control method of virtual key, terminal and computer storage media
CN113495487A (en) Terminal and method for adjusting operation parameters of target equipment
CN108401173A (en) Interactive terminal, method and the computer readable storage medium of mobile live streaming
CN108037699B (en) Robot, control method of robot, and computer-readable storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant