CN112306238A - Method and device for determining interaction mode, electronic equipment and storage medium


Info

Publication number
CN112306238A
Authority
CN
China
Prior art keywords
target
determining
target object
interaction
behavior data
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202011166110.7A
Other languages
Chinese (zh)
Inventor
Inventor not announced (不公告发明人)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing ByteDance Network Technology Co Ltd
Original Assignee
Beijing ByteDance Network Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing ByteDance Network Technology Co Ltd filed Critical Beijing ByteDance Network Technology Co Ltd
Priority to CN202011166110.7A
Publication of CN112306238A
Status: Pending

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/011: Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 2203/00: Indexing scheme relating to G06F 3/00 - G06F 3/048
    • G06F 2203/01: Indexing scheme relating to G06F 3/01
    • G06F 2203/011: Emotion or mood input determined on the basis of sensed human body parameters such as pulse, heart rate or beat, temperature of skin, facial expressions, iris, voice pitch, brain activity patterns

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The embodiments of the disclosure disclose a method and a device for determining an interaction mode, an electronic device and a storage medium. The method includes: if a target object is located in a data acquisition area, acquiring first behavior data of the target object; determining interaction feature data of the target object according to the first behavior data; and determining a target interaction mode corresponding to the target object according to the interaction feature data. The technical scheme of the embodiments enriches the information interaction modes, improves the information interaction efficiency, and improves the information interaction experience of the target object.

Description

Method and device for determining interaction mode, electronic equipment and storage medium
Technical Field
The embodiment of the disclosure relates to the technical field of computers, and in particular relates to a method and a device for determining an interaction mode, an electronic device and a storage medium.
Background
With the continuous progress of electronic technology, smart devices have become very popular. Most smart devices are equipped with various sensors, and when a user needs to perform information interaction, the device can interact with the user through a selected interaction mode. However, the related interaction technology still suffers from technical problems such as a single information interaction mode and low information interaction efficiency, which degrade the user's information interaction experience.
Disclosure of Invention
The embodiment of the disclosure provides a method and a device for determining an interaction mode, electronic equipment and a storage medium.
In a first aspect, an embodiment of the present disclosure provides a method for determining an interaction mode, where the method includes:
if the target object is located in the data acquisition area, acquiring first behavior data of the target object;
determining interactive feature data of the target object according to the first behavior data;
and determining a target interaction mode corresponding to the target object according to the interaction feature data.
In a second aspect, an embodiment of the present disclosure further provides an apparatus for determining an interaction mode, where the apparatus includes:
the first behavior data acquisition module is used for acquiring first behavior data of a target object if the target object is located in a data acquisition area;
the interactive feature data determining module is used for determining interactive feature data of the target object according to the first behavior data;
and the target interaction mode determining module is used for determining a target interaction mode corresponding to the target object according to the interaction characteristic data.
In a third aspect, an embodiment of the present disclosure further provides an electronic device, where the electronic device includes:
one or more processors;
a storage device for storing one or more programs,
when the one or more programs are executed by the one or more processors, the one or more processors implement the method for determining the interaction mode according to any one of the embodiments of the present disclosure.
In a fourth aspect, the embodiments of the present disclosure further provide a storage medium containing computer-executable instructions, which when executed by a computer processor, are used to perform the method for determining an interaction mode according to any one of the embodiments of the present disclosure.
According to the technical scheme of the embodiments of the disclosure, when the target object is located in the data acquisition area, the first behavior data of the target object is acquired; that is, the first behavior data can be collected automatically as soon as the target object appears. The interaction feature data of the target object is then determined from the first behavior data, so that the interaction intention of the target object is inferred from its own behavior. Finally, the target interaction mode corresponding to the target object is determined from the interaction feature data. The scheme is particularly suitable for the situation where the target object is located in the data acquisition area but has not actively issued an explicit interaction request: it improves the initiative of the interaction device or interaction system in an interaction scenario and realizes intelligent interaction between the device or system and the target object. It thereby solves the technical problems of the related art, such as a single information interaction mode and low information interaction efficiency, achieving the technical effects of enriching the information interaction modes, improving the information interaction efficiency, and improving the information interaction experience of the target object.
Drawings
The above and other features, advantages and aspects of various embodiments of the present disclosure will become more apparent by referring to the following detailed description when taken in conjunction with the accompanying drawings. Throughout the drawings, the same or similar reference numbers refer to the same or similar elements. It should be understood that the drawings are schematic and that elements and features are not necessarily drawn to scale.
Fig. 1 is a schematic flowchart of a method for determining an interaction mode according to a first embodiment of the present disclosure;
fig. 2 is a schematic flowchart of a method for determining an interaction mode according to a second embodiment of the disclosure;
fig. 3 is a schematic flowchart of a method for determining an interaction mode according to a third embodiment of the present disclosure;
fig. 4 is a schematic flowchart of a method for determining an interaction mode according to a fourth embodiment of the present disclosure;
fig. 5 is a schematic flowchart of a method for determining an interaction mode according to a fifth embodiment of the present disclosure;
fig. 6 is a schematic flowchart of a preferred example of a method for determining an interaction mode according to a sixth embodiment of the present disclosure;
fig. 7 is a schematic structural diagram of an interactive mode determining device according to a seventh embodiment of the present disclosure;
fig. 8 is a schematic structural diagram of an electronic device according to an eighth embodiment of the present disclosure.
Detailed Description
Embodiments of the present disclosure will be described in more detail below with reference to the accompanying drawings. While certain embodiments of the present disclosure are shown in the drawings, it is to be understood that the present disclosure may be embodied in various forms and should not be construed as limited to the embodiments set forth herein, but rather are provided for a more thorough and complete understanding of the present disclosure. It should be understood that the drawings and embodiments of the disclosure are for illustration purposes only and are not intended to limit the scope of the disclosure.
It should be understood that the various steps recited in the method embodiments of the present disclosure may be performed in a different order, and/or performed in parallel. Moreover, method embodiments may include additional steps and/or omit performing the illustrated steps. The scope of the present disclosure is not limited in this respect.
The term "include" and variations thereof as used herein are open-ended, i.e., "including but not limited to". The term "based on" is "based, at least in part, on". The term "one embodiment" means "at least one embodiment"; the term "another embodiment" means "at least one additional embodiment"; the term "some embodiments" means "at least some embodiments". Relevant definitions for other terms will be given in the following description.
It should be noted that the terms "first", "second", and the like in the present disclosure are only used for distinguishing different devices, modules or units, and are not used for limiting the order or interdependence relationship of the functions performed by the devices, modules or units.
It is noted that references to "a", "an", and "the" modifications in this disclosure are intended to be illustrative rather than limiting, and that those skilled in the art will recognize that "one or more" may be used unless the context clearly dictates otherwise.
Example one
Fig. 1 is a schematic flowchart of a method for determining an interaction mode according to a first embodiment of the present disclosure. This embodiment is particularly suitable for situations where an information interaction mode is selected automatically to interact with a user. The method may be performed by a server, which may be a single server or a server cluster composed of multiple communicatively connected servers that together provide one or more functions.
As shown in fig. 1, the method for determining an interaction manner provided in this embodiment may specifically include:
and S110, if the target object is located in the data acquisition area, acquiring first behavior data of the target object.
The target object is the object for which an interaction mode is to be determined. Illustratively, the target object may include a target user or a target item. Typically, the target user is a target person. Optionally, the data acquisition region is determined based on acquisition parameters of the data acquisition device, where the acquisition parameters may include an acquisition range, an acquisition mode, and the like. The first behavior data may include static behavior data and dynamic behavior data. Taking the target object being a target user as an example, the first behavior data may include facial data, gesture data, eye data, body posture data, sound data of the target user, and the like. Taking the target object being a target item as an example, the first behavior data may include the display state of the item, the display content of the item, and the like. For example, when the target item is a book, its display state may be open or closed, and its display content may include text content and/or picture content. It should be understood that any data capable of characterizing the behavior state of the target object can serve as first behavior data; the examples above are illustrative rather than limiting.
In the embodiments of the present disclosure, the target object being located in the data acquisition area may be understood to mean that the first behavior data of the target object can be acquired by a data acquisition device covering that area. For example, the first behavior data may be acquired by a camera, a sound acquisition device, an infrared sensing device, or similar equipment. Note that whether first behavior data is actually collected, and which behavior data is collected, is determined by the actual acquisition result: when the target object is located in the data acquisition area, whatever behavior data of the target object is acquired serves as the first behavior data.
It should be noted that the number of target objects may be one, two, or more. If there are two or more target objects, the first behavior data may include the independent behavior data of each target object, or behavior data of interactions between the target objects. For example, in a learning scenario, the target objects may be hands, pens, and workbooks.
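As an aside, the acquisition step S110 can be pictured as a small polling routine. The following Python sketch is only an illustration of the idea under assumed interfaces: the sensor objects with a read() method and the presence threshold are hypothetical, not part of the disclosure.

    def object_in_acquisition_area(infrared_reading, threshold=0.5):
        # Treat the target object as present when the infrared reading
        # exceeds a threshold (hypothetical presence test).
        return infrared_reading > threshold

    def acquire_first_behavior_data(sensors):
        # sensors: dict mapping a source name ("camera", "microphone", ...)
        # to an object exposing a read() method (assumed interface).
        # Which behavior data is actually collected depends on the real
        # acquisition result, as noted above.
        first_behavior_data = {}
        for name, sensor in sensors.items():
            reading = sensor.read()
            if reading is not None:
                first_behavior_data[name] = reading
        return first_behavior_data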
And S120, determining the interactive feature data of the target object according to the first behavior data.
In a practical application scenario, the acquired first behavior data is often varied and is generally the raw behavior expression data of the target object. The first behavior data may therefore be analyzed to determine the interaction feature data of the target object. The interaction feature data may be understood as the feature data within the first behavior data from which an interaction intention can be recognized, or which satisfies an interaction response condition. Specifically, feature extraction may be performed on the first behavior data based on the interaction data types of preset interaction data, and the extracted behavior data belonging to those interaction data types serves as the interaction feature data, where the interaction data type is related to the object type of the target object.
For example, when the target object is a target user, the interaction data types may include object attribute data, mood data, body posture data, concentration status data, sound data, and the like. The object attribute data may include the gender, age, and so on of the target object. The mood data may include facial expression data and/or limb movement data. Optionally, the body posture data may indicate a standing, sitting, or lying position. Illustratively, the concentration status data may include eye data and/or limb motion data, and the sound data may include voice data and/or noise data.
The interaction feature data of the target object may be determined based on a single item of the first behavior data, or based on a combination of two or more items. Optionally, when a combination of two or more items of behavior data is adopted, a weight may be set for each item according to the interaction scene, the interaction object, and/or the data acquisition manner; the items of behavior data are weighted accordingly, and the interaction feature data of the target object is then determined from the weighted items.
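A minimal sketch of the weighted combination described above, assuming each behavior item has already been scored on a common numeric scale; the score scale and weight values are invented for illustration.

    def weighted_interaction_features(behavior_scores, weights):
        # behavior_scores: e.g. {"eye": 0.7, "voice": 0.2, "posture": 0.9}
        # weights: scene-dependent weights, e.g. {"eye": 0.5, "voice": 0.2}
        # Items without an explicit weight keep their raw score.
        return {item: score * weights.get(item, 1.0)
                for item, score in behavior_scores.items()}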
And S130, determining a target interaction mode corresponding to the target object according to the interaction feature data.
The interaction feature data is determined from the first behavior data of the target object, and the target interaction mode for information interaction with the target object is determined from the specific content of the interaction feature data. That is, the behavior intention of the target object is inferred from the interaction feature data, and the target interaction mode corresponding to that intention is then determined automatically. The target interaction mode may include a target interaction device and/or target interaction content, among others.
Optionally, before determining the target interaction mode corresponding to the target object according to the interaction feature data, the method further includes: pre-establishing a correspondence between each item of interaction feature data and at least one target interaction mode. That is, at least one target interaction mode may be set for each item of interaction feature data, and the target interaction modes corresponding to different items may be the same or different; this is not specifically limited here.
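The pre-established correspondence can be held in a simple lookup table. The keys and mode names below are invented placeholders; the disclosure leaves the concrete mapping open.

    # Each item of interaction feature data maps to at least one target
    # interaction mode; different items may share modes.
    MODE_TABLE = {
        "distressed": ["show_comforting_text", "play_soothing_song"],
        "book_reading": ["start_reading_function"],
        "book_recording": ["start_recording_function"],
    }

    def target_interaction_modes(feature_key):
        return MODE_TABLE.get(feature_key, ["default_prompt"])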
For example, determining a target interaction mode corresponding to the target object according to the interaction feature data may include: and determining a target interaction state of a target object according to at least one item of the interaction characteristic data, and determining a target interaction mode corresponding to the target object based on the target interaction state. The target interaction state can comprise a state which can represent the behavior intention of the user, such as posture information, emotion information and the like of the target object.
For example, if the target object is a target user and the target interaction state of the target user is determined, from at least one item of the interaction feature data, to be a distressed state, the target interaction mode may be a comfort-oriented interaction mode for alleviating that distress. For instance, warm and comforting text, pictures or animations may be shown on the display device, a related comforting sentence may be played, or a song for relaxing the mood may be played.
If the target object is a target book and the target interaction state of the target book is determined, from at least one item of the interaction feature data, to be a reading, analysis, recording or uploading state, the corresponding target interaction mode may be, respectively, starting a reading function, an analysis function, a recording function or an uploading function for the book content. The specific target interaction mode used to interact with the target object is not specifically limited by the embodiments of the present disclosure.
According to the technical scheme of this embodiment, if the target object is located in the data acquisition area, its first behavior data is acquired; the first behavior data can thus be collected automatically as soon as the target object appears. The interaction feature data of the target object is then determined automatically from the first behavior data, so that the interaction intention of the target object is inferred from its own behavior, and finally the target interaction mode corresponding to the target object is determined from the interaction feature data. The scheme is particularly suitable for the situation where the target object is located in the data acquisition area but has not actively issued an explicit interaction request: it improves the initiative of the interaction device or interaction system in the interaction scenario, realizes intelligent interaction between the device or system and the target object, and largely solves the technical problems of the related art such as a single information interaction mode and low information interaction efficiency, thereby enriching the information interaction modes, improving the information interaction efficiency, and improving the information interaction experience of the target object.
Example two
Fig. 2 is a flowchart illustrating a method for determining an interaction mode according to a second embodiment of the disclosure. This embodiment may be combined with various alternatives of the above-described embodiments. In this embodiment, optionally, the target object includes a target user; the method may further comprise: determining identity information of the target object; accordingly, the determining of the interaction feature data of the target object from the first behavior data may comprise: and determining the interactive feature data of the target object according to the identity information and the first behavior data.
As shown in fig. 2, the method for determining the interaction mode provided in this embodiment may specifically include:
s210, if the target object is located in the data acquisition area, first behavior data of the target object are acquired.
S220, identity information of the target object is determined, and interactive feature data of the target object is determined according to the identity information and the first behavior data.
Before determining the interaction mode for information interaction with the target user, the identity information of the target user can be determined first. Optionally, the target object includes a target user and/or a target item; when the target object includes a target user, the identity information includes the age and/or gender of the target user.
Optionally, determining the identity information of the target object includes: acquiring, through the data acquisition device, data for determining the identity information of the target object, and determining the identity information from the acquired data. For example, image information of the target object may be acquired by an image acquisition device, and the identity information of the target object determined from the image information.
Illustratively, the determining identity information of the target object according to the image information includes: identifying the acquired image information, and determining the identity information of the target object according to the identification result; or comparing the acquired image information with pre-stored image information, and determining the identity information of the target object according to the comparison result.
As mentioned above, since the acquired first behavior data may be very rich, the identity information of the target object may be combined with the first behavior data to determine the interaction feature data. Optionally, determining the interaction feature data of the target object according to the identity information and the first behavior data includes: determining the interaction behavior type according to the identity information, and determining the interaction feature data of the target object based on the first behavior data and the interaction behavior type. Determining the interaction feature data based on the first behavior data and the interaction behavior type may include: determining the target behavior data in the first behavior data that belongs to the interaction behavior type, and determining the interaction feature data of the target object based on that target behavior data. It is understood that the interaction behavior types corresponding to target objects with different identity information may be the same or different.
Taking the target object being a target user as an example, facial image information of the target user can be collected, and the age bracket of the target user determined from it; the interaction behavior type of the target user is then determined according to the age bracket, and the interaction feature data of the target object is determined based on the first behavior data and the interaction behavior type. For example, if the age bracket of the target user is determined to be 3 to 7 years based on the facial image information, voice-type data and limb-motion-type data may be used as the interaction behavior types of the target user.
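One way to realize this identity-dependent filtering is sketched below; the age brackets and type names are assumptions for illustration only.

    # Hypothetical mapping from an identified age bracket to the interaction
    # behavior types considered for that user.
    AGE_BRACKET_TYPES = {
        "3-7": {"voice", "limb_motion"},
        "adult": {"voice", "gesture", "touch"},
    }

    def filter_interaction_feature_data(first_behavior_data, age_bracket):
        # Keep only the target behavior data whose type belongs to the
        # identity-derived interaction behavior types.
        allowed = AGE_BRACKET_TYPES.get(age_bracket, {"voice"})
        return {t: v for t, v in first_behavior_data.items() if t in allowed}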
Taking the target object being a target item as an example, the first behavior data whose interaction behavior type is a state change of the item may be used as the target behavior data.
And S230, determining a target interaction mode corresponding to the target object according to the interaction feature data.
Optionally, the determining a target interaction manner corresponding to the target object according to the interaction feature data includes: and determining a preset interaction mode corresponding to the target object according to the identity information, and determining a target interaction mode in the preset interaction mode according to the first behavior data. The preset interaction mode may include various interaction information, for example, may include an interaction device, specific interaction content, and the like.
It can be understood that, before determining the preset interaction mode corresponding to the target object according to the identity information, the method further includes: establishing a correspondence between the identity information of interaction objects and preset interaction modes, where the interaction objects include the target object. A preset interaction mode corresponding to the target object can then be determined according to the identity information and this correspondence. It should be noted that the target interaction modes within different preset interaction modes may be the same or different.
Illustratively, if the target object is a target user and the identity information includes age information, the preset interaction modes may correspond to different age groups and may be classified, for example, into an adult mode and a child mode. Specifically, the preset interaction mode corresponding to the target object is determined from the target object's age information, and the target interaction mode within that preset mode is then determined from the first behavior data. For instance, the target age group of the target user is determined from the age information; if the preset interaction mode corresponding to that age group is the child mode, the target interaction mode within the child mode is determined from the first behavior data.
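A sketch of this two-stage selection (preset mode from identity, then target mode from behavior data) follows; the age cut-off and the rules inside are invented examples, not the disclosed scheme.

    def preset_mode_for_age(age):
        # Illustrative split into a child mode and an adult mode.
        return "child" if age < 14 else "adult"

    def target_mode_within_preset(preset, first_behavior_data):
        # The concrete target interaction mode is chosen within the preset
        # mode according to the first behavior data.
        if preset == "child" and "voice" in first_behavior_data:
            return "child_voice_dialog"
        if preset == "adult" and "touch" in first_behavior_data:
            return "adult_touch_menu"
        return preset + "_default"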
Illustratively, if the target object is a target book and the identity information includes book content, the preset interaction mode may be a reading mode, an uploading mode, an analyzing mode, or the like.
Of course, in this embodiment, the target interaction mode corresponding to the target object may also be determined from the interaction feature data alone, without distinguishing the identity information of the target object, using a technical scheme similar to S130; refer to the foregoing for the specific implementation, which is not repeated here.
According to the technical scheme of this embodiment, the first behavior data of the target object is acquired, the identity information of the target object is determined, and the interaction feature data of the target object is determined from the identity information and the first behavior data. The target interaction mode determined based on such interaction feature data can better suit the personalized requirements of the target object, thereby greatly improving the user experience.
EXAMPLE III
Fig. 3 is a flowchart illustrating a method for determining an interaction mode according to a third embodiment of the present disclosure. This embodiment may be combined with various alternatives of the above-described embodiments. In this embodiment, optionally, after determining the target interaction mode corresponding to the target object according to the interaction feature data, the method further includes: and starting the target interaction equipment corresponding to the target interaction mode.
As shown in fig. 3, the method for determining the interaction mode of the present embodiment may specifically include:
s310, if the target object is located in the data acquisition area, first behavior data of the target object are acquired.
In the embodiment of the present disclosure, when the target object is located in the data acquisition region, the first behavior data of the target object may be acquired by at least one data acquisition device. That is, the first behavior data of the target object may be acquired by one, two, or more than two data acquisition devices, which is not limited in particular herein.
And S320, determining the interactive feature data of the target object according to the first behavior data.
And S330, determining a target interaction mode corresponding to the target object according to the interaction feature data.
Generally, the target interaction mode is realized by means of a target interaction device, and different target interaction modes may rely on different target interaction devices. For example, when the target interaction mode is shooting a target image, it depends on a shooting device; when it is receiving sound information, it depends on a sound acquisition device; when it is receiving a control instruction, it depends on a control element; and when it is monitoring a change in certain data, it may depend on one or several sensors, and so on.
And S340, starting the target interaction equipment corresponding to the target interaction mode.
The target interaction equipment can be understood as a data acquisition device corresponding to the target interaction mode. Similarly, there may be one, two or more than two target interaction devices corresponding to the target interaction mode. The specific number and the specific type of the target interaction devices are not particularly limited by the embodiments of the present disclosure. Illustratively, the target interaction device includes at least one of an infrared sensor, a camera, a control element, and a sound acquisition device; wherein the control element comprises a virtual control element and/or a physical control element.
It can be understood that starting the target interaction device corresponding to the target interaction mode includes: and if the target interaction equipment corresponding to the target interaction mode is in a closed state, opening the target interaction equipment. Of course, if the target interaction device corresponding to the target interaction mode is in the on state, the on state of the target interaction device is maintained.
After the target interaction mode for information interaction with the target object is determined, the target interaction device can be started in a targeted manner to realize the information interaction. Optionally, after the target interaction device corresponding to the target interaction mode is started, the method further includes: turning off the remaining interaction devices, i.e., the interaction devices other than the target interaction device corresponding to the target interaction mode. The advantage of this arrangement is that only the needed devices are used for interaction while unnecessary devices are turned off, which saves power, reduces the occupation of server resources, and optimizes resource configuration.
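The start-one, close-the-rest policy can be expressed compactly; the device interface (is_on/turn_on/turn_off) is an assumed stand-in for whatever driver API the real devices expose.

    def apply_target_interaction_mode(all_devices, target_names):
        # all_devices: dict mapping device name to a driver object with
        # hypothetical is_on()/turn_on()/turn_off() methods.
        for name, device in all_devices.items():
            if name in target_names:
                if not device.is_on():
                    device.turn_on()   # open it only if currently closed
                # otherwise keep its on state
            else:
                device.turn_off()      # close the remaining devices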
According to the technical scheme of this embodiment, after the target interaction mode corresponding to the target object is determined from the interaction feature data, the target interaction device corresponding to that mode is started in a targeted manner. This scheme is particularly suitable when multiple interaction devices are provided; compared with starting all interaction devices by default as in the related art, it achieves the technical effect of optimized resource configuration.
Example four
Fig. 4 is a flowchart illustrating a method for determining an interaction mode according to a fourth embodiment of the disclosure. This embodiment may be combined with various alternatives of the above-described embodiments. In this embodiment, optionally, after the target interaction device corresponding to the target interaction mode is started, the method further includes: and acquiring second behavior data of the target object based on the target interaction equipment, and determining a target application program to be operated according to the second behavior data.
As shown in fig. 4, the method for determining the interaction mode of the present embodiment may specifically include:
and S410, if the target object is located in the data acquisition area, acquiring first behavior data of the target object.
Optionally, the first behavior data includes preset behavior data and autonomous behavior data. The preset behavior data is behavior data with a clear interaction purpose, such as controlling or starting some application program through that behavior; it can be simply understood as preset control behavior data, and is typically behavior data input in a targeted manner. The autonomous behavior data can be understood as behavior data of the target object without a clear interaction purpose.
Optionally, the target object includes a target user; the autonomous behavior data includes state information and/or action information of the target user, where the state information includes facial expression information, eye information and/or posture information, and the posture information includes sitting and standing postures. The facial expression information may include, for example, everyday calm, smiling, frowning or grimacing expressions. The eye information may include eye muscle activity information, eye opening and closing information, sclera state information, tear state information, and/or pupil state information.
Taking the target object being a target user as an example, the preset behavior data may include inputting a preset voice control instruction, triggering a control element following a specific rule, inputting a specific interaction gesture, and the like; the autonomous behavior data may include the facial expression data of the target user, and so on.
And S420, determining the interactive feature data of the target object according to the first behavior data.
And S430, determining a target interaction mode corresponding to the target object according to the interaction feature data.
And S440, starting the target interaction equipment corresponding to the target interaction mode.
S450, second behavior data of the target object is collected based on the target interaction device, and the target application program to be operated is determined according to the second behavior data.
In the process of performing information interaction with the target object in the determined target interaction mode, second behavior data of the target object can be collected through the target interaction device, and the corresponding target application program is executed in response to the second behavior data to realize the information interaction. The second behavior data may be the same as or different from the first behavior data. Similarly, the second behavior data may also include preset behavior data and autonomous behavior data; for their explanation, refer to S410, which is not repeated here.
There are various ways of determining the target application to be run from the second behavior data. For example, when information interaction with the target user takes the form of playing songs, the sound acquisition device may collect a voice from the user asking to stop playback; the second behavior data is then that voice, and the target application to be run is accordingly determined to be one that stops playing the current song. As another example, if the collected second behavior data is the target user turning a page of a book, the target application to be run may be one that reads out the text content of the next page. The specific content of the second behavior data and the specific recommended target application are not limited here.
Optionally, when the second behavior data includes preset behavior data, collecting the second behavior data of the target object through the target interaction device and determining the target application from it includes: if the preset behavior data of the target object is acquired by the target interaction device, executing the target application corresponding to that preset behavior data. Illustratively, if the preset behavior data is a hand swing and the corresponding target application plays dance videos, then when the shooting device captures the hand-swing behavior of the target object, the target application playing the relevant dance video is executed.
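The preset-behavior branch amounts to a dispatch table; the behavior keys and application names below are invented for illustration (compare the hand-swing example above).

    # Invented table tying preset behavior data to the target application
    # it triggers.
    PRESET_ACTIONS = {
        "hand_swing": "play_dance_video",
        "stop_voice": "stop_current_song",
        "page_turn": "read_next_page",
    }

    def application_for_preset(preset_behavior):
        return PRESET_ACTIONS.get(preset_behavior)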
When the second behavior data further includes autonomous behavior data, collecting the second behavior data of the target object through the target interaction device and determining the target application from it includes: if the preset behavior data of the target object is not acquired by the target interaction device within the preset time period, collecting the autonomous behavior data of the target object, and determining the target application to be run from the autonomous behavior data.
The autonomous behavior data refers to the behavior spontaneously expressed by the target object, as captured by the target interaction device. If the target interaction device does not acquire preset behavior data of the target object within the preset time period, it actively collects the autonomous behavior data of the target object and determines the target application to be run from it. For example, if the target user makes no action related to the preset behavior data within the preset time period, e.g., the user simply stands in the data acquisition area of the target interaction device, the user's facial expression information may be obtained; if the expression indicates an everyday calm state, a target application such as one displaying the time and/or weather information may be executed.
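The timeout-then-fallback logic might look like the sketch below, reusing application_for_preset from the earlier sketch; device.poll_preset() and device.read_autonomous() are hypothetical interfaces, and the calm-state rule mirrors the time/weather example above.

    import time

    def determine_target_application(device, timeout_s=5.0):
        deadline = time.monotonic() + timeout_s
        while time.monotonic() < deadline:
            preset = device.poll_preset()          # assumed non-blocking poll
            if preset is not None:
                return application_for_preset(preset)
            time.sleep(0.1)
        # No preset behavior within the preset time period: fall back to
        # actively collected autonomous behavior data.
        autonomous = device.read_autonomous()
        if autonomous.get("expression") == "calm":
            return "show_time_and_weather"
        return None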
Optionally, determining a target application to be run according to the autonomic behavior data includes: and determining the working state of the target object according to the autonomous behavior data, and determining the target application program to be operated based on the working state. The operating state may include, for example, a continuous operating state and an out-of-operation state. When the target object is a target user, the off-working state may include a fatigue state, a distraction state, and the like.
Optionally, determining the working state of the target object according to the autonomous behavior data includes: and determining the working state of the target object according to the autonomous behavior data and the historical behavior data. Illustratively, the autonomous behavior data of the target object is compared with the historical behavior data, and the working state of the target object is determined according to the comparison result. For example, the autonomous behavior data of the target object may be analyzed to obtain a behavior rule of the target object and/or a target behavior feature corresponding to a working state, and the working state of the target object may be determined according to the autonomous behavior data and the behavior rule of the target object and/or the target behavior feature corresponding to the working state. The advantage of this arrangement is that the target object can be assisted more intelligently in completing the work.
For example, by analyzing historical behavior data, it is determined that the target user will be in a fatigue state when learning for about two hours, the learning duration of the target user may be determined according to the current behavior data, and then whether the target user is tired is determined according to whether the learning duration reaches two hours. If the learning duration reaches two hours, it may be determined that the target user is in a fatigue state, and of course, it may be further determined whether the target user is in a fatigue state in combination with other behavior data. If the learning duration does not reach two hours, it can be further determined whether the target user is in a distracted state in combination with other behavioral data.
The target behavior feature corresponding to the working state can be understood as a specific behavior feature expressed when the target user is in a certain working state. For example, the target user may have an eye rubbing action or yawning action several times over a period of time while in a fatigue state.
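A toy version of the fatigue judgment that cross-checks the duration rule learned from historical data with the target behavior features just mentioned; the two-hour threshold follows the example above, while the action-count threshold is an assumption.

    def working_state(session_minutes, eye_rub_count, yawn_count):
        # Fatigue is inferred either from the learned duration rule
        # (about two hours of continuous learning) or from repeated
        # fatigue-typical actions within the observation window.
        if session_minutes >= 120 or (eye_rub_count + yawn_count) >= 3:
            return "fatigued"
        return "working"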
On the basis of the above technical solutions, optionally, the method for determining the interaction mode further includes: determining identity information of the target object, and determining historical program running data corresponding to the identity information. Correspondingly, determining the target application to be run from the second behavior data includes: determining the target application to be run from the historical program running data and the second behavior data.
The historical program running data refers to the target applications run by the target object within a preset historical time period. Determining the target application from the historical program running data and the second behavior data may specifically mean: obtaining, from the historical program running data, the historical application that the target object ran under similar second behavior data in the time period closest to the current time, and taking that historical application as the target application to be run; that is, the application run the last time the user produced similar second behavior data is used as the target application. Alternatively, the historical application run most frequently under the second behavior data may be obtained from the historical program running data and taken as the target application to be run; that is, the application the target object most often used under that second behavior data is recommended.
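Both selection strategies (most recent vs. most frequent) are easy to state over a simple history log; the tuple layout is an assumed representation.

    from collections import Counter

    def app_from_history(history, behavior, strategy="recent"):
        # history: list of (timestamp, behavior, app) tuples, oldest first.
        matching = [app for _, b, app in history if b == behavior]
        if not matching:
            return None
        if strategy == "recent":
            # The application run the last time similar behavior occurred.
            return matching[-1]
        # Otherwise: the application run most often under this behavior.
        return Counter(matching).most_common(1)[0][0]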
Optionally, after determining the target application to be run from the second behavior data, the method further includes: running the target application. To improve the interaction experience and meet the personalized requirements of the target object, the method may first ask whether the target object wants the target application run before executing it. Optionally, after determining the target application to be run from the second behavior data, the method further includes: generating inquiry information asking whether the target object wants the target application run, and, in response to the target object's feedback on the inquiry, determining from that feedback whether to run it. The advantage of this arrangement is that the interaction process is more human-friendly and can respond more accurately to the needs of the target user.
According to the technical scheme, after the target interaction equipment corresponding to the target interaction mode is started, the second behavior data of the target object can be collected based on the target interaction equipment, so that the behavior intention of the user can be determined more accurately through the second behavior data, the target application program to be operated is determined according to the second behavior data, the accuracy of responding to the user requirement is improved, and the information interaction experience of the user is further improved.
EXAMPLE five
Fig. 5 is a flowchart illustrating a method for determining an interaction mode according to a fifth embodiment of the present disclosure. This embodiment may be combined with various alternatives of the above-described embodiments. In this embodiment, optionally, the method for determining the interaction manner further includes: acquiring environmental data of the data acquisition area; correspondingly, the determining a target interaction mode corresponding to the target object according to the interaction feature data includes: and determining a target interaction mode corresponding to the target object according to the environment data and the interaction feature data.
On the basis of the foregoing alternatives, optionally, the determining a target application to be run according to the second behavior data includes: and determining a target application program to be operated according to the environment data and the second behavior data.
As shown in fig. 5, the method for determining the interaction mode of the present embodiment may specifically include:
and S510, if the target object is located in the data acquisition area, acquiring first behavior data of the target object.
S520, determining the interactive feature data of the target object according to the first behavior data.
And S530, acquiring the environmental data of the data acquisition area.
The embodiments of the present disclosure fully consider that environmental information in the interaction scene may influence the interaction effect. Environmental data of the data acquisition area can therefore be acquired, specifically by an environment acquisition device such as a shooting device and/or a sound acquisition device. Optionally, the shooting device may be a front camera or an overhead camera.
For example, the ambient light intensity can be collected by the shooting device, so that the lighting or screen brightness can be adjusted automatically according to the collected intensity. The overhead shooting device can collect image information of the work surface, judge from it whether the surface is tidy, and remind the user to tidy it up; it can also capture the target user's writing. The front shooting device can judge the sitting posture, expression and gaze of the target object, and from these determine whether the target object has bad behavioral habits, for example reminding the user to correct a bad sitting posture or to protect eye health. Environmental sound can be collected by the sound acquisition device.
And S540, determining a target interaction mode corresponding to the target object according to the environment data and the interaction feature data.
In the embodiments of the present disclosure, the current environment of the target object may be determined from the acquired environment data, the behavior intention of the target object from the interaction feature data, and a target interaction mode suited to the target object in the current environment recommended accordingly. For example, when the environment data shows that the target object is in a noisy environment, prompt information may be generated and displayed; the prompt may be presented in various ways, e.g., through voice broadcasting, an indicator light, and/or a display screen. The target interaction mode corresponding to the target object under the environment data is then determined from the interaction feature data. It should be noted that the target interaction mode determined from the environment data and the interaction feature data is whichever mode benefits interaction accuracy; its specific form may be determined by actual requirements and is not specifically limited here.
On the basis of the above optional technical solutions, optionally, after determining the target interaction mode corresponding to the target object according to the environment data and the interaction feature data, the method may further include: and starting the target interaction equipment corresponding to the target interaction mode. And then, second behavior data of the target object is collected based on the target interaction equipment, and a target application program to be operated is determined according to the environment data and the second behavior data. For example, if it is determined that a large noise exists in the current environment of the target object according to the environment data, a target application program which responds to the second behavior data of the target user in a text, picture or video manner may be selected and recommended to better adapt to the environment and improve the interaction experience of the user.
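An environment-aware choice of response channel could be sketched as follows; the noise and light scales, thresholds, and channel names are invented for illustration.

    def select_interaction_settings(noise_level, light_level):
        # noise_level, light_level: assumed 0.0-1.0 readings from the
        # environment acquisition devices.
        channel = "text_picture_or_video" if noise_level > 0.7 else "voice"
        brightness = "high" if light_level < 0.3 else "normal"
        return {"response_channel": channel, "screen_brightness": brightness}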
According to the technical scheme of the embodiment, the influence of the environmental factors on the interaction process is fully considered, the most appropriate target interaction mode for responding to the first behavior data in the current environment is determined by combining the acquired environmental data and the first behavior data, the interaction effect is effectively guaranteed, and the user experience is greatly improved.
EXAMPLE six
Fig. 6 is a schematic flowchart of a preferred example of a method for determining an interaction mode according to a sixth embodiment of the present disclosure. Referring to fig. 6, when a target object is located in the data acquisition area, an infrared sensor may be used to determine whether the target object is recognized within a period of time; a microphone may acquire voice data to determine whether interaction with the target object occurred within a period of time; a camera may acquire image data to determine whether the target object is recognized within a period of time; and a touch screen may be used to determine whether information interaction with the target object occurred within a period of time. Together these determine whether the target object is currently in front of the target interaction device: the determination may be made by one or more sensors interacting with, or recognizing, a particular target object or body over a period of time (e.g., 5 seconds).
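The multi-sensor presence check over a short window might be coded as below; the sample layout and the any-source policy are assumptions, with the 5-second window taken from the example in the text.

    import time

    def target_object_present(detections, window_s=5.0):
        # detections: dict mapping a source ("infrared", "microphone",
        # "camera", "touch_screen") to a list of (timestamp, hit) samples.
        now = time.monotonic()
        for samples in detections.values():
            if any(hit for ts, hit in samples if now - ts <= window_s):
                return True
        return False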
Specifically, the first behavior data of the target object may be acquired via the one or more sensors, the camera, the microphone, the touch screen, and so on. The interaction feature data of the target object is determined from the first behavior data and is used to judge whether the target object exhibits a specific behavior intention in the current scene. Whether the first behavior data matches a certain behavior intention can also be judged from whether the system ran a historical application within a period of time (e.g., one day, one hour, 10 s, and the like).
Alternatively, whether the target object at the current time is a historical object may be determined from history data; that is, the identity information of the user is determined. The history data may include historical behavior data of a historical object, a target object profile, and the like, so that the target interaction mode or target application can be determined more accurately from the identity information together with the first behavior data and/or second behavior data of the target object. Recommending the target interaction mode or target application includes automatically starting, automatically recognizing, and/or automatically running the target application, and the like.
The method for determining the interaction mode provided by this embodiment may determine the interaction mode through cross-judgment of various behavior data: for example, judging whether the target object is out of its seat or in its seat, whether it is in a working or non-working state, and the target user's concentration, writing speed, fatigue, and so on, and then determining the interaction mode by combining the various behavior data with the specific scene.
The working state of the target object can also be evaluated; illustratively, the learning state of the target user is evaluated through the user's concentration or fatigue. Meanwhile, efficiency management can be realized by intervening with the target object, providing a closed-loop target plan and incentive values according to the real learning data, and realizing personalized recommendation of the interaction mode according to the behavior data of the target object.
The technical scheme of the embodiments of the present disclosure is particularly suitable for the situation where a target object is located in the data acquisition area but has not actively issued an explicit interaction request. It improves the initiative of the interaction device or system in the interaction scenario, realizes intelligent interaction between the device or system and the target object, and solves the related-art problems of a single information interaction mode and low information interaction efficiency, thereby enriching the information interaction modes, improving the information interaction efficiency, and improving the information interaction experience of the target object.
Example seven
Fig. 7 is a schematic structural diagram of an interactive manner determining device provided in a seventh embodiment of the present disclosure, and as shown in fig. 7, the interactive manner determining device in the seventh embodiment of the present disclosure includes: a first behavior data obtaining module 710, an interactive feature data determining module 720 and a target interaction mode determining module 730.
The first behavior data acquiring module 710 is configured to acquire first behavior data of a target object if the target object is located in a data acquisition area; an interactive feature data determining module 720, configured to determine interactive feature data of the target object according to the first behavior data; and a target interaction mode determining module 730, configured to determine a target interaction mode corresponding to the target object according to the interaction feature data.
On the basis of the above optional technical solutions, the determining device for the interaction manner may further include:
and the target interaction equipment starting module is used for starting the target interaction equipment corresponding to the target interaction mode.
On the basis of the above optional technical solutions, the target interaction device includes at least one of an infrared sensor, a camera, a control element, and a sound acquisition device; wherein the control element comprises a virtual control element and/or a physical control element.
On the basis of the above optional technical solutions, the device for determining an interaction manner may further include:
and the target application program recommending module is used for acquiring second behavior data of the target object based on the target interaction equipment after the target interaction equipment corresponding to the target interaction mode is started, and determining the target application program to be operated according to the second behavior data.
On the basis of the above optional technical solutions, the second behavior data includes preset behavior data; accordingly, the target application recommendation module comprises:
and the target application program execution unit is used for executing the target application program corresponding to the preset behavior data if the preset behavior data of the target object is acquired based on the target interaction equipment.
On the basis of the above optional technical solutions, the second behavior data further includes autonomous behavior data;
accordingly, the target application recommendation module further comprises:
and the target application program recommending unit is used for acquiring the autonomous behavior data of the target object and determining the target application program to be operated according to the autonomous behavior data if the preset behavior data of the target object is not acquired by the target interaction equipment within a preset time period.
On the basis of the above optional technical solutions, the target application recommending unit may be configured to:
and determining the working state of the target object according to the autonomous behavior data, and determining the target application program to be operated based on the working state.
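Putting the two preceding units together, the recommendation logic is: an explicit preset behavior (for example, a bound gesture or voice command) takes priority; if none arrives within the preset time period, the autonomous behavior is observed and the working state decides the application. The following Python sketch assumes hypothetical device methods and helper functions; none of the names come from the disclosure.

```python
def pick_target_app(device, timeout_s=3.0):
    """Two-stage decision: preset behavior first, then autonomous."""
    preset = device.wait_for_preset_behavior(timeout=timeout_s)
    if preset is not None:
        return preset.bound_app  # e.g. a gesture bound to an application
    autonomous = device.observe_autonomous_behavior()
    state = infer_working_state(autonomous)  # sitting/standing, focus, ...
    return app_for_state(state)
```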
On the basis of the above optional technical solutions, the target object includes a target user; the autonomic behavior data comprises state information and/or action information of the target user, wherein the state information comprises facial expression information, eye information and/or posture information; the posture information includes a sitting posture and a standing posture.
On the basis of the above optional technical solutions, the device for determining an interaction manner may further include:
and the environment data acquisition module is used for acquiring the environment data of the data acquisition area.
Accordingly, the target interaction manner determining module may include:
and the first target interaction mode determining unit is used for determining a target interaction mode corresponding to the target object according to the environment data and the interaction characteristic data.
On the basis of the above optional technical solutions, the target application recommendation module may be configured to: and determining a target application program to be operated according to the environment data and the second behavior data.
On the basis of the above optional technical solutions, the target object includes a target user; the device for determining the interaction mode may further include:
the identity information determining module is used for determining the identity information of the target user;
accordingly, the interactive feature data determination module may be operable to:
and determining the interactive characteristic data of the target user according to the identity information and the first behavior data.
On the basis of the above optional technical solutions, the device for determining an interaction manner may be further configured to:
determine identity information of the target object, and determine historical program operation data corresponding to the identity information;
accordingly, the target application recommendation module may be operable to:
and determining a target application program to be operated according to the historical program operation data and the second behavior data.
On the basis of the above optional technical solutions, the determining device for the interaction manner may further include:
running the target application program after the target application program to be operated is determined according to the second behavior data; or,
generating, after the target application program to be operated is determined according to the second behavior data, inquiry information for inquiring whether the target object runs the target application program, and determining, in response to feedback information of the target object to the inquiry information, whether to run the target application program according to the feedback information.
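The run-or-ask behavior above amounts to a small branch. A minimal sketch follows, in which ask_user and run_app are hypothetical helpers standing in for the inquiry and execution mechanisms.

```python
def launch_with_confirmation(app, auto_run=False):
    """Either run the selected application directly, or ask first."""
    if auto_run:
        run_app(app)
        return
    answer = ask_user(f"Open {app} now?")  # the inquiry information
    if answer == "yes":                    # the feedback information
        run_app(app)
```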
On the basis of the above optional technical solutions, the target interaction manner determining module may include:
and the second target interaction mode determining unit is used for determining a preset interaction mode corresponding to the target object according to the identity information and determining a target interaction mode in the preset interaction mode according to the first behavior data.
Optionally, the target object includes a target user; the identity information includes the age and/or gender of the target user.
According to the technical scheme of the embodiment of the disclosure, when the target object is located in the data acquisition area, the first behavior data of the target object is acquired, so that the first behavior data can be collected automatically as soon as the target object appears. The interaction feature data of the target object is then determined according to the first behavior data, so that the interaction intention of the target object is determined from its first behavior data. Finally, the target interaction mode corresponding to the target object is determined according to the interaction feature data. The scheme is therefore particularly suitable for the situation in which the target object is located in the data acquisition area but does not actively send out a clear interaction request: the initiative of the interaction device or interaction system in the interaction scene is fully improved, intelligent interaction between the interaction device or interaction system and the target object is realized, and the technical problems in the related art of a single information interaction mode and low information interaction efficiency are solved, thereby enriching information interaction modes, improving information interaction efficiency, and improving the information interaction experience of the target object.
The interactive mode determining device provided by the embodiment of the disclosure can execute the interactive mode determining method provided by any embodiment of the disclosure, and has corresponding functional modules and beneficial effects of the execution method.
It should be noted that, the units and modules included in the apparatus are merely divided according to functional logic, but are not limited to the above division as long as the corresponding functions can be realized; in addition, specific names of the functional units are only used for distinguishing one functional unit from another, and are not used for limiting the protection scope of the embodiments of the present disclosure.
In addition, an embodiment of the present disclosure also provides an information interaction system implemented based on the method for determining an interaction mode in the embodiments of the present disclosure. The system may include a data acquisition device, and the data acquisition device may be provided with at least one data acquisition module. Optionally, the target interaction device includes at least one of an infrared sensor, a camera, a control element, and a sound acquisition device, wherein the control element comprises a virtual control element and/or a physical control element. Illustratively, the data acquisition device may be integrated on a desk lamp. The camera may include a front camera and/or an overhead camera. The display device may be a display device having an input function, such as a touch display screen, and the sound collection device may be a microphone or the like. The information interaction system provided by the embodiment of the disclosure can execute the method for determining the interaction mode provided by any embodiment of the disclosure, and has the corresponding functional modules and beneficial effects of the executed method.
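For concreteness, one possible composition of such a data acquisition device (for example, one integrated in a desk lamp) could be declared as below; the class and field names are illustrative assumptions, not terms from the disclosure.

```python
from dataclasses import dataclass

@dataclass
class DataAcquisitionDevice:
    """One example hardware configuration for the interaction system."""
    infrared_sensor: bool = True
    front_camera: bool = True
    overhead_camera: bool = False
    microphone: bool = True       # the sound collection device
    touch_display: bool = True    # display with an input function
```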
Example eight
Referring to fig. 8, a schematic structural diagram of an electronic device (e.g., a terminal device or a server) 800 suitable for implementing embodiments of the present disclosure is shown. The terminal device in the embodiments of the present disclosure may include, but is not limited to, a mobile terminal such as a mobile phone, a notebook computer, a digital broadcast receiver, a PDA (personal digital assistant), a PAD (tablet computer), a PMP (portable multimedia player), a vehicle terminal (e.g., a car navigation terminal), and the like, and a stationary terminal such as a digital TV, a desktop computer, and the like. The electronic device shown in fig. 8 is only an example, and should not bring any limitation to the functions and the scope of use of the embodiments of the present disclosure.
As shown in fig. 8, an electronic device 800 may include a processing means (e.g., central processing unit, graphics processor, etc.) 801 that may perform various appropriate actions and processes in accordance with a program stored in a Read Only Memory (ROM) 802 or a program loaded from a storage means 808 into a Random Access Memory (RAM) 803. In the RAM 803, various programs and data necessary for the operation of the electronic apparatus 800 are also stored. The processing apparatus 801, the ROM 802, and the RAM 803 are connected to each other by a bus 804. An input/output (I/O) interface 805 is also connected to the bus 804.
Generally, the following devices may be connected to the I/O interface 805: input devices 806 including, for example, a touch screen, touch pad, keyboard, mouse, camera, microphone, accelerometer, gyroscope, etc.; output devices 807 including, for example, a Liquid Crystal Display (LCD), speakers, vibrators, and the like; storage devices 808 including, for example, magnetic tape, hard disk, etc.; and a communication device 809. The communication device 809 may allow the electronic device 800 to communicate wirelessly or by wire with other devices to exchange data. While fig. 8 illustrates an electronic device 800 having various means, it is to be understood that not all illustrated means are required to be implemented or provided. More or fewer devices may alternatively be implemented or provided.
In particular, according to an embodiment of the present disclosure, the processes described above with reference to the flowcharts may be implemented as computer software programs. For example, embodiments of the present disclosure include a computer program product comprising a computer program carried on a non-transitory computer readable medium, the computer program containing program code for performing the method illustrated by the flowchart. In such an embodiment, the computer program may be downloaded and installed from a network through the communication device 809, or installed from the storage device 808, or installed from the ROM 802. The computer program, when executed by the processing apparatus 801, performs the above-described functions defined in the methods of the embodiments of the present disclosure.
The electronic device provided by this embodiment of the present disclosure and the method for determining an interaction mode provided by the foregoing embodiments belong to the same inventive concept; technical details not described in detail in this embodiment can be found in the foregoing embodiments, and this embodiment has the same beneficial effects as the foregoing embodiments.
Example nine
This embodiment of the present disclosure provides a computer storage medium on which a computer program is stored, and when the computer program is executed by a processor, the method for determining an interaction mode provided by the foregoing embodiments is implemented.
It should be noted that the computer readable medium in the present disclosure can be a computer readable signal medium or a computer readable storage medium or any combination of the two. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples of the computer readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the present disclosure, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. In contrast, in the present disclosure, a computer readable signal medium may comprise a propagated data signal with computer readable program code embodied therein, either in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: electrical wires, optical cables, RF (radio frequency), etc., or any suitable combination of the foregoing.
In some embodiments, the clients and servers may communicate using any currently known or future developed network protocol, such as HTTP (HyperText Transfer Protocol), and may interconnect with any form or medium of digital data communication (e.g., a communication network). Examples of communication networks include a local area network ("LAN"), a wide area network ("WAN"), an internetwork (e.g., the Internet), and peer-to-peer networks (e.g., ad hoc peer-to-peer networks), as well as any currently known or future developed network.
The computer readable medium may be embodied in the electronic device; or may exist separately without being assembled into the electronic device.
The computer readable medium carries one or more programs which, when executed by the electronic device, cause the electronic device to:
if the target object is located in the data acquisition area, acquiring first behavior data of the target object;
determining interactive feature data of the target object according to the first behavior data;
and determining a target interaction mode corresponding to the target object according to the interaction feature data.
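Taken together, the three program steps above form a short pipeline. The sketch below wires them up in Python; object_present echoes the earlier presence-check sketch, and determine_features and choose_mode are placeholders for the embodiment-specific logic, all of them assumptions rather than prescribed interfaces.

```python
def determine_interaction_mode(sensors):
    """Three steps: detect presence, extract features, choose mode."""
    if not object_present(sensors):       # target in the acquisition area?
        return None
    first_behavior = sensors.collect_first_behavior()
    features = determine_features(first_behavior)   # interactive features
    return choose_mode(features)                    # target interaction mode
```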
Computer program code for carrying out operations of the present disclosure may be written in any combination of one or more programming languages, including but not limited to object oriented programming languages such as Java, Smalltalk, and C++, and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server. In the case of a remote computer, the remote computer may be connected to the user's computer through any type of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet service provider).
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The units described in the embodiments of the present disclosure may be implemented by software or hardware. The name of a unit or module does not, in some cases, limit the unit or module itself; for example, the first behavior data acquiring module may also be described as a "behavior data acquiring module".
The functions described herein above may be performed, at least in part, by one or more hardware logic components. For example, without limitation, exemplary types of hardware logic components that may be used include: field Programmable Gate Arrays (FPGAs), Application Specific Integrated Circuits (ASICs), Application Specific Standard Products (ASSPs), systems on a chip (SOCs), Complex Programmable Logic Devices (CPLDs), and the like.
In the context of this disclosure, a machine-readable medium may be a tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. The machine-readable medium may be a machine-readable signal medium or a machine-readable storage medium. A machine-readable medium may include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. More specific examples of a machine-readable storage medium would include an electrical connection based on one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
According to one or more embodiments of the present disclosure, [ example one ] there is provided a method of determining an interaction manner, the method including:
if the target object is located in the data acquisition area, acquiring first behavior data of the target object;
determining interactive feature data of the target object according to the first behavior data;
determining a target interaction mode corresponding to the target object according to the interaction feature data.
According to one or more embodiments of the present disclosure, [ example two ] there is provided a method of determining an interaction manner, further comprising:
optionally, after the determining, according to the interaction feature data, a target interaction manner corresponding to the target object, the method further includes:
and starting the target interaction equipment corresponding to the target interaction mode.
According to one or more embodiments of the present disclosure, [ example three ] there is provided a method of determining an interaction manner, further comprising:
optionally, the target interaction device includes at least one of an infrared sensor, a camera, a control element, and a sound acquisition device; wherein the control element comprises a virtual control element and/or a physical control element.
According to one or more embodiments of the present disclosure, [ example four ] there is provided a method of determining an interaction manner, further comprising:
optionally, after the target interaction device corresponding to the target interaction mode is started, the method further includes:
and acquiring second behavior data of the target object based on the target interaction equipment, and determining a target application program to be operated according to the second behavior data.
According to one or more embodiments of the present disclosure, [ example five ] there is provided a method of determining an interaction manner, further comprising:
optionally, the second behavior data includes preset behavior data;
correspondingly, the acquiring second behavior data of the target object based on the target interaction device, and determining the target application program according to the second behavior data includes:
and if preset behavior data of the target object are acquired based on the target interaction equipment, executing a target application program corresponding to the preset behavior data.
According to one or more embodiments of the present disclosure, [ example six ] there is provided a method of determining an interaction manner, further comprising:
optionally, the second behavior data further comprises autonomic behavior data;
correspondingly, the acquiring second behavior data of the target object based on the target interaction device, and determining the target application program according to the second behavior data includes:
if the preset behavior data of the target object is not acquired by the target interaction equipment within a preset time period, acquiring the autonomous behavior data of the target object, and determining a target application program to be operated according to the autonomous behavior data.
According to one or more embodiments of the present disclosure, [ example seven ] there is provided a method of determining an interaction manner, further comprising:
optionally, the determining a target application to be run according to the autonomic behavior data includes:
and determining the working state of the target object according to the autonomous behavior data, and determining a target application program to be operated based on the working state.
According to one or more embodiments of the present disclosure, [ example eight ] there is provided a method of determining an interaction manner, further comprising:
optionally, the target object comprises a target user; the autonomic behavior data comprises status information and/or action information of the target user, wherein the status information comprises facial expression information, eye information, and/or pose information; the posture information includes a sitting posture and a standing posture.
According to one or more embodiments of the present disclosure, [ example nine ] there is provided a method of determining an interaction manner, further comprising:
optionally, obtaining environmental data of the data acquisition area;
correspondingly, the determining a target interaction mode corresponding to the target object according to the interaction feature data includes:
and determining a target interaction mode corresponding to the target object according to the environment data and the interaction feature data.
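Environment data can shift the decision even when the interaction feature data alone would choose differently, for example preferring touch over voice in a noisy room. The thresholds and field names in the sketch below are illustrative assumptions.

```python
def choose_mode_with_environment(features, env):
    """Adjust the feature-based mode choice using environment data."""
    mode = choose_mode(features)  # placeholder for the feature-based choice
    if mode == "voice" and env.get("noise_db", 0) > 70:
        return "touch"  # speech capture is unreliable in loud rooms
    if mode == "touch" and env.get("lux", 300) < 5:
        return "voice"  # screen interaction is awkward in the dark
    return mode
```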
According to one or more embodiments of the present disclosure, [ example ten ] there is provided a method of determining an interaction manner, further comprising:
optionally, the determining, according to the second behavior data, a target application to be run includes:
and determining a target application program to be operated according to the environment data and the second behavior data.
According to one or more embodiments of the present disclosure, [ example eleven ] there is provided a method of determining an interaction manner, further comprising:
optionally, the identity information of the target object is determined, and historical program operation data corresponding to the identity information is determined;
correspondingly, the determining the target application program to be run according to the second behavior data includes:
and determining a target application program to be operated according to the historical program operation data and the second behavior data.
According to one or more embodiments of the present disclosure, [ example twelve ] there is provided a method of determining an interaction manner, further comprising:
optionally, after determining the target application to be run according to the second behavior data, the method further includes:
running the target application program; or,
and generating inquiry information for inquiring whether the target object runs the target application program, responding to feedback information of the target object to the inquiry information, and determining whether to run the target application program according to the feedback information.
According to one or more embodiments of the present disclosure, [ example thirteen ] provides a method of determining an interaction manner, further comprising:
optionally, the target object comprises a target user; the scheme also comprises the following steps:
determining identity information of the target user;
correspondingly, the determining the interaction feature data of the target object according to the first behavior data comprises:
and determining the interactive feature data of the target user according to the identity information and the first behavior data.
According to one or more embodiments of the present disclosure, [ example fourteen ] there is provided a method of determining an interaction manner, further comprising:
optionally, the determining a target interaction manner corresponding to the target object according to the interaction feature data includes:
and determining a preset interaction mode corresponding to the target object according to the identity information, and determining a target interaction mode in the preset interaction mode according to the first behavior data.
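As an illustration of example fourteen, the identity information first narrows the candidate set of preset interaction modes, and the first behavior data then selects within it. The age-group mapping and field names below are assumptions for demonstration only.

```python
# Hypothetical mapping from identity (age group) to preset modes.
PRESET_MODES = {
    "child": ["voice", "touch"],            # simpler modalities for children
    "adult": ["voice", "touch", "gesture"],
}

def mode_from_identity(identity, first_behavior):
    """Identity picks the preset set; behavior picks within it."""
    candidates = PRESET_MODES.get(identity.get("age_group"), ["touch"])
    if first_behavior.get("speech_detected") and "voice" in candidates:
        return "voice"
    return candidates[0]  # fall back to the first preset mode
```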
According to one or more embodiments of the present disclosure, [ example fifteen ] there is provided an interactive manner determining method, further comprising:
the target object comprises a target user; the identity information comprises the age and/or gender of the target user.
According to one or more embodiments of the present disclosure, [ example sixteen ] there is provided an interactive manner determination apparatus, including:
the first behavior data acquisition module is used for acquiring first behavior data of a target object if the target object is located in a data acquisition area;
the interactive feature data determining module is used for determining interactive feature data of the target object according to the first behavior data;
and the target interaction mode determining module is used for determining a target interaction mode corresponding to the target object according to the interaction characteristic data.
The foregoing description is merely a description of the preferred embodiments of the present disclosure and of the technical principles employed. It will be appreciated by those skilled in the art that the scope of the disclosure is not limited to technical solutions formed by the particular combination of the features described above, and also covers other technical solutions formed by any combination of the above features or their equivalents without departing from the concept of the disclosure, for example, technical solutions formed by replacing the above features with (but not limited to) features having similar functions disclosed in the present disclosure.
Further, while operations are depicted in a particular order, this should not be understood as requiring that such operations be performed in the particular order shown or in sequential order. Under certain circumstances, multitasking and parallel processing may be advantageous. Likewise, while several specific implementation details are included in the above discussion, these should not be construed as limitations on the scope of the disclosure. Certain features that are described in the context of separate embodiments can also be implemented in combination in a single embodiment. Conversely, various features that are described in the context of a single embodiment can also be implemented in multiple embodiments separately or in any suitable subcombination.
Although the subject matter has been described in language specific to structural features and/or methodological acts, it is to be understood that the subject matter defined in the appended claims is not necessarily limited to the specific features or acts described above. Rather, the specific features and acts described above are disclosed as example forms of implementing the claims.

Claims (18)

1. A method for determining an interaction mode is characterized by comprising the following steps:
if the target object is located in the data acquisition area, acquiring first behavior data of the target object;
determining interactive feature data of the target object according to the first behavior data;
and determining a target interaction mode corresponding to the target object according to the interaction feature data.
2. The method according to claim 1, further comprising, after determining a target interaction mode corresponding to the target object according to the interaction feature data:
and starting the target interaction equipment corresponding to the target interaction mode.
3. The method of claim 2, wherein the target interaction device comprises at least one of an infrared sensor, a camera, a control element, and a sound acquisition device; wherein the control element comprises a virtual control element and/or a physical control element.
4. The method according to claim 2, further comprising, after the starting of the target interaction device corresponding to the target interaction mode:
and acquiring second behavior data of the target object based on the target interaction equipment, and determining a target application program to be operated according to the second behavior data.
5. The method of claim 4, wherein the second behavior data comprises preset behavior data;
correspondingly, the acquiring second behavior data of the target object based on the target interaction device, and determining the target application program according to the second behavior data includes:
and if preset behavior data of the target object are acquired based on the target interaction equipment, executing a target application program corresponding to the preset behavior data.
6. The method of claim 4, wherein the second behavior data further comprises autonomic behavior data;
correspondingly, the acquiring second behavior data of the target object based on the target interaction device, and determining the target application program according to the second behavior data includes:
if the preset behavior data of the target object is not acquired by the target interaction equipment within a preset time period, acquiring the autonomous behavior data of the target object, and determining a target application program to be operated according to the autonomous behavior data.
7. The method of claim 6, wherein determining the target application to be run based on the autonomic behavior data comprises:
and determining the working state of the target object according to the autonomous behavior data, and determining a target application program to be operated based on the working state.
8. The method of claim 6 or 7, wherein the target object comprises a target user; the autonomic behavior data comprises status information and/or action information of the target user, wherein the status information comprises facial expression information, eye information, and/or posture information; the posture information includes a sitting posture and a standing posture.
9. The method of claim 4, further comprising:
acquiring environmental data of the data acquisition area;
correspondingly, the determining a target interaction mode corresponding to the target object according to the interaction feature data includes:
and determining a target interaction mode corresponding to the target object according to the environment data and the interaction feature data.
10. The method of claim 9, wherein determining the target application to be run according to the second behavior data comprises:
and determining a target application program to be operated according to the environment data and the second behavior data.
11. The method of claim 4, further comprising:
determining identity information of the target object, and determining historical program operation data corresponding to the identity information;
correspondingly, the determining the target application program to be run according to the second behavior data includes:
and determining a target application program to be operated according to the historical program operation data and the second behavior data.
12. The method of claim 4, further comprising, after determining the target application to be run based on the second behavior data:
running the target application program; or,
and generating inquiry information for inquiring whether the target object runs the target application program, responding to feedback information of the target object to the inquiry information, and determining whether to run the target application program according to the feedback information.
13. The method of claim 1, wherein the target object comprises a target user; the method further comprises the following steps:
determining identity information of the target object;
correspondingly, the determining the interaction feature data of the target object according to the first behavior data comprises:
and determining the interactive feature data of the target object according to the identity information and the first behavior data.
14. The method according to claim 13, wherein the determining a target interaction mode corresponding to the target object according to the interaction feature data comprises:
and determining a preset interaction mode corresponding to the target object according to the identity information, and determining a target interaction mode in the preset interaction mode according to the first behavior data.
15. The method of claim 14, wherein the target object comprises a target user; the identity information includes the age and/or gender of the target user.
16. An interactive mode determining apparatus, comprising:
the first behavior data acquisition module is used for acquiring first behavior data of a target object if the target object is located in a data acquisition area;
the interactive feature data determining module is used for determining interactive feature data of the target object according to the first behavior data;
and the target interaction mode determining module is used for determining a target interaction mode corresponding to the target object according to the interaction characteristic data.
17. An electronic device, characterized in that the electronic device comprises:
one or more processors;
a storage device for storing one or more programs,
when executed by the one or more processors, cause the one or more processors to implement the method of interactive mode determination recited in any of claims 1-15.
18. A storage medium containing computer-executable instructions for performing the method of interactive mode determination of any of claims 1-15 when executed by a computer processor.
CN202011166110.7A 2020-10-27 2020-10-27 Method and device for determining interaction mode, electronic equipment and storage medium Pending CN112306238A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011166110.7A CN112306238A (en) 2020-10-27 2020-10-27 Method and device for determining interaction mode, electronic equipment and storage medium

Publications (1)

Publication Number Publication Date
CN112306238A true CN112306238A (en) 2021-02-02

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113160832A (en) * 2021-04-30 2021-07-23 合肥美菱物联科技有限公司 Voice washing machine intelligent control system and method supporting voiceprint recognition
CN114885092A (en) * 2022-03-23 2022-08-09 青岛海尔科技有限公司 Control method and device of image acquisition device, storage medium and electronic device

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20070168987A1 (en) * 2003-12-30 2007-07-19 Eric Vetillard Method for determining operational characteristics of a program
CN108196767A (en) * 2017-12-28 2018-06-22 广东小天才科技有限公司 Electronic equipment and use control method thereof
CN109697095A (en) * 2018-11-26 2019-04-30 量子云未来(北京)信息科技有限公司 A kind of method, apparatus and terminal device promoting user's sleep
CN110007758A (en) * 2019-03-26 2019-07-12 维沃移动通信有限公司 A kind of control method and terminal of terminal
CN110109596A (en) * 2019-05-08 2019-08-09 芋头科技(杭州)有限公司 Recommended method, device and the controller and medium of interactive mode
CN111061953A (en) * 2019-12-18 2020-04-24 深圳市优必选科技股份有限公司 Intelligent terminal interaction method and device, terminal equipment and storage medium
CN111680177A (en) * 2020-06-01 2020-09-18 广东小天才科技有限公司 Data searching method, electronic device and computer-readable storage medium



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination