CN117130284A - Intelligent device control method and electronic device


Info

Publication number: CN117130284A
Application number: CN202210546887.9A
Authority: CN (China)
Prior art keywords: electronic device, target, intention, user, intent
Legal status: Pending
Other languages: Chinese (zh)
Inventor: 曾立
Assignee (original and current): Huawei Technologies Co Ltd
Application filed by Huawei Technologies Co Ltd
Priority to CN202210546887.9A
Priority to PCT/CN2023/094602 (WO2023221995A1)
Publication of CN117130284A

Classifications

    • G05B15/02: Systems controlled by a computer, electric
    • G05B19/418: Total factory control, i.e. centrally controlling a plurality of machines, e.g. direct or distributed numerical control [DNC], flexible manufacturing systems [FMS], integrated manufacturing systems [IMS] or computer integrated manufacturing [CIM]
    • G05B2219/2642: PC applications; domotics, domestic, home control, automation, smart house
    • Y02P90/02: Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation; total factory control, e.g. smart factories, flexible manufacturing systems [FMS] or integrated manufacturing systems [IMS]

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Manufacturing & Machinery (AREA)
  • Quality & Reliability (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The present application provides an intelligent device control method and an electronic device, and relates to the field of terminal technologies. With this method, a scene can be edited at the granularity of an intent, so the multiple devices corresponding to the intent do not need to be set one by one, which reduces the complexity of scene editing and improves the efficiency of device control. The method may be applied to a first electronic device, or to a component (such as a chip system) that supports the first electronic device in implementing the relevant functions. The method includes: displaying a first interface, the first interface including an identification of one or more intents, at least one of the one or more intents corresponding to a plurality of electronic devices; in response to a user operation selecting a target intent from the one or more intents, adding the target intent to a target scene; and, when a trigger condition of the target scene is met, controlling one or more target electronic devices to execute the target intent, the one or more target electronic devices being the electronic devices corresponding to the target intent.

Description

Intelligent device control method and electronic device
Technical Field
Embodiments of this application relate to the field of terminal technologies, and in particular, to an intelligent device control method and an electronic device.
Background
Currently, users own more and more devices. For example, in a home scenario, the various devices in a home can be connected through internet of things technology to form a smart home system, enabling centralized device control and providing users with functions such as home appliance control, lighting control, and anti-theft alarms.
However, because the devices are numerous, a user who needs to operate several of them has to switch between different interfaces in a smart home application to find each device to be controlled, which is cumbersome and time-consuming.
Disclosure of Invention
To solve the above technical problem, embodiments of this application provide an intelligent device control method and an electronic device. In the technical solutions provided by the embodiments of this application, a scene can be edited at the granularity of an intent, and the multiple devices corresponding to the intent do not need to be set one by one, which reduces the complexity of scene editing and improves the efficiency of device control.
To achieve this technical purpose, the embodiments of this application provide the following technical solutions:
In a first aspect, an intelligent device control method is provided, applied to a first electronic device or to a component (such as a chip system) that supports the first electronic device in implementing the relevant functions. The method includes: displaying a first interface, the first interface including an identification of one or more intents, at least one of the one or more intents corresponding to a plurality of electronic devices; in response to a user operation selecting a target intent from the one or more intents, adding the target intent to a target scene; and controlling one or more target electronic devices to execute the target intent when a trigger condition of the target scene is met, the one or more target electronic devices being the electronic devices corresponding to the target intent.
In this solution, on the scene editing interface (the first interface), the user can edit a scene at the granularity of an intent, without setting the multiple devices corresponding to the intent one by one. This avoids the cumbersome operations caused by repeated interface switching, saves scene editing time, and improves the efficiency of device control.
Illustratively, as shown in (d) of fig. 7, the smart panel displays an interface 404 (the first interface); the interface 404 includes identifications of a plurality of intents in an area 404a (such as the intent cards "lights fully on", "lights fully off", and "combined light (brightness 60%)"). In response to the user selecting target intents (such as the combined light (brightness 60%), temperature 24 degrees, and curtain fully closed intent cards) from the plurality of intent cards in the area 404a, the smart panel adds the target intents to the viewing mode scene (i.e., the target scene).
Subsequently, when a trigger condition of the target scene is met (such as the condition 404j shown in (d) of fig. 7), the smart panel controls one or more target electronic devices to execute the target intents.
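By way of illustration only, the following Python sketch shows one way the above flow could be modeled: a scene holds intents rather than individual devices, and meeting the trigger condition fans each intent out to every device mapped to it. All class, function, and device names here are hypothetical and are not part of this application.

```python
from dataclasses import dataclass, field

@dataclass
class Intent:
    """An intent such as "combined light (brightness 60%)"; structure assumed."""
    name: str
    device_ids: list[str]               # one intent may map to several devices
    params: dict = field(default_factory=dict)

@dataclass
class Scene:
    name: str
    trigger: str                        # e.g. "scene card tapped"
    intents: list[Intent] = field(default_factory=list)

    def add_intent(self, intent: Intent) -> None:
        # Editing at intent granularity: one user action brings in every
        # device behind the intent, with no per-device setup.
        self.intents.append(intent)

def run_scene(scene: Scene, send) -> None:
    """When the trigger condition of the scene is met, control all target devices."""
    for intent in scene.intents:
        for dev in intent.device_ids:
            send(dev, {"intent": intent.name, **intent.params})

# Usage: the "viewing mode" scene of fig. 7 (d); device names assumed.
viewing = Scene("viewing mode", trigger="scene card tapped")
viewing.add_intent(Intent("combined light", ["lamp1", "lamp2"], {"brightness": 60}))
viewing.add_intent(Intent("temperature", ["aircon1"], {"celsius": 24}))
run_scene(viewing, send=lambda dev, cmd: print(dev, cmd))
```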
In one possible design, the target electronic devices include a second electronic device and a third electronic device, and controlling the one or more target electronic devices to execute the target intent includes:
sending a first control instruction to the second electronic device and a second control instruction to the third electronic device, so that the second electronic device and the third electronic device jointly execute the target intent.
By way of example, assuming that the target scene includes the combined light intent (brightness 60%), the smart panel sends a first control instruction to lamp 1 corresponding to the combined light intent and a second control instruction to lamp 2 corresponding to the combined light intent; lamps 1 and 2 jointly execute the combined light intent (brightness 60%), their total brightness being 60%.
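A minimal sketch of this joint-execution case follows; the even split of the target brightness is an assumption made for illustration, since the description above only states that lamps 1 and 2 together reach 60%.

```python
def split_combined_light(total_brightness: int, lamp_ids: list[str]) -> dict[str, dict]:
    # Assumed policy: divide the target brightness evenly across the lamps
    # so that they jointly execute the combined light intent.
    share = total_brightness // len(lamp_ids)
    return {lamp: {"cmd": "set_brightness", "value": share} for lamp in lamp_ids}

# First control instruction to lamp 1, second control instruction to lamp 2.
for lamp, instruction in split_combined_light(60, ["lamp1", "lamp2"]).items():
    print(lamp, instruction)
```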
In one possible design, the target intent includes a first target intent and a second target intent, and the target electronic devices include a second electronic device and a third electronic device. Controlling the one or more target electronic devices to execute the target intent includes:
sending a first control instruction to the second electronic device, the first control instruction being used to control the second electronic device to execute the first target intent; and
sending a second control instruction to the third electronic device, the second control instruction being used to control the third electronic device to execute the second target intent.
Illustratively, as shown in (d) of fig. 7, the viewing mode scene (the target scene) includes a combined light intent (the first target intent) and a temperature intent (the second target intent). The smart panel sends a first control instruction to a lamp in the living room (the second electronic device), the first control instruction being used to control the lamp in the living room to turn on and adjust to 60% brightness. The smart panel also sends a second control instruction, used to control execution of the temperature adjustment intent, to, for example, an air conditioner in the living room (the third electronic device).
In one possible design, the first interface further includes identifications of a plurality of spaces of the whole house; the plurality of spaces include a first space and a second space; the identification of the first space is selected; and the one or more intents include intents executable by electronic devices in the first space.
For example, as shown in (a) of fig. 12A, the interface 404 may also include identifications of multiple spaces of the whole house (e.g., identifications of the whole house, the living room (an example of the first space), and the dining room (an example of the second space)). The identification of the living room is selected (e.g., displayed with black fill), and the plurality of intents contained in the area 404a of the interface 404 include intents executable by the electronic devices in the living room.
In one possible design, the method further includes:
receiving a user operation on the identification of the second space, and displaying a second interface, the second interface including identifications of the intents executable by the electronic devices in the second space.
Illustratively, as shown in (b) of fig. 12A, the smart panel receives a user operation on the identification of the dining room and displays an interface 1101 (i.e., the second interface); the interface 1101 includes identifications of the intents executable by the electronic devices in the dining room.
In this way, the user can conveniently switch between spaces and edit scenes for each space.
In one possible design, the identification of the second space is selected, and the one or more intents are a plurality of intents that further include intents executable by the electronic devices in the second space.
For example, as shown in fig. 12B, in the interface 1203 the identifications of both the dining room and the living room are selected, and the plurality of intents in the area 404a of the interface 1203 include intents executable by the electronic devices in the dining room (such as the humidity (medium) intent card) and intents executable by the electronic devices in the living room.
In this way, intents from multiple spaces can be displayed on one interface, making it convenient to combine multiple cross-space intents into a target scene. This meets the user's scene-creation needs without editing and setting multiple devices one by one.
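A minimal sketch of space-based intent filtering, assuming a simple space-to-intents registry (the data and names are illustrative, not from the patent):

```python
# Hypothetical registry: which intents the devices in each space can execute.
INTENTS_BY_SPACE = {
    "living room": ["lights fully on", "lights fully off", "combined light (60%)"],
    "dining room": ["humidity (medium)"],
}

def intents_for(selected_spaces: list[str]) -> list[tuple[str, str]]:
    """Return (space, intent) pairs for every selected space, so the first
    interface can show, and visually distinguish, intents from several
    spaces at once."""
    return [(space, intent)
            for space in selected_spaces
            for intent in INTENTS_BY_SPACE.get(space, [])]

print(intents_for(["living room"]))                   # only the first space selected
print(intents_for(["living room", "dining room"]))    # cross-space scene editing
```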
In one possible design, the identifications of the intents executable by the electronic devices in the first space have a different user interface (UI) effect from the identifications of the intents executable by the electronic devices in the second space.
For example, as shown in fig. 12B, the identifications of the intents executable by the electronic devices in the dining room, such as the humidity (medium) intent card, have a different UI effect from the identifications of the intents executable by the electronic devices in the living room.
In this way, the user can clearly distinguish intents belonging to different spaces, quickly select the desired intent of the desired space, and add it to the target scene, which improves the efficiency of scene editing.
In one possible design, the control parameter corresponding to the target intent is a first control parameter, and the method further includes:
receiving a second operation input by the user;
in response to the second operation, displaying a first control, the first control being used to input a second control parameter corresponding to the target intent; and
receiving the second control parameter input by the user through the first control, so that the control parameter corresponding to the target intent is adjusted to the second control parameter.
For example, as shown in (b) of fig. 9, assuming that the target intent is the combined light intent and its initial brightness (the first control parameter) is 30%, the smart panel receives an operation (the second operation) in which the user long-presses the combined light intent card 404k. In response to the long press, the smart panel displays a brightness adjustment bar 404m (the first control). The smart panel then receives the new brightness (the second control parameter) input by the user through the brightness adjustment bar 404m, so that the brightness corresponding to the combined light intent is adjusted to the new value (e.g., 60%).
In this way, the user can adjust the control parameters corresponding to a target intent in the target scene, which meets the user's device control needs and improves the flexibility of device control.
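The parameter-adjustment design can be sketched as follows; the event names and the card model are assumptions for illustration only.

```python
class IntentCard:
    """Hypothetical model behind an intent card with an adjustable parameter."""
    def __init__(self, name: str, brightness: int):
        self.name = name
        self.brightness = brightness      # first control parameter, e.g. 30

    def on_long_press(self) -> str:
        # A long press displays the first control (a brightness slider).
        return "show_brightness_slider"

    def on_slider_input(self, value: int) -> None:
        # The slider value is the second control parameter.
        self.brightness = value           # e.g. adjusted to 60

card = IntentCard("combined light", brightness=30)
card.on_long_press()
card.on_slider_input(60)
assert card.brightness == 60
```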
In one possible design, the one or more intents pertain to one or more subsystems.
By way of example, the one or more intents may correspond to a lighting subsystem, a cooling and heating fresh air subsystem, a sunshade subsystem, and the like. For example, when the one or more intents include lights fully on, lights fully off, and combined light on (brightness parameter 60%), the corresponding subsystem is the lighting subsystem. For another example, when the one or more intents include lights fully on, lights fully off, combined light on (brightness parameter 60%), a constant temperature intent, and a constant humidity intent, the corresponding subsystems are the lighting subsystem and the cooling and heating fresh air subsystem.
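The intent-to-subsystem relation described above amounts to a simple mapping; the sketch below uses assumed names and is not a disclosed data structure of this application.

```python
# Assumed intent -> subsystem mapping, following the examples above.
SUBSYSTEM_OF = {
    "lights fully on": "lighting",
    "lights fully off": "lighting",
    "combined light (60%)": "lighting",
    "constant temperature": "cooling and heating fresh air",
    "constant humidity": "cooling and heating fresh air",
}

def subsystems_of(intents: list[str]) -> set[str]:
    """The subsystems that a given set of intents corresponds to."""
    return {SUBSYSTEM_OF[i] for i in intents if i in SUBSYSTEM_OF}

print(subsystems_of(["lights fully on", "constant temperature"]))
# -> {'lighting', 'cooling and heating fresh air'}
```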
In a second aspect, an intelligent device control method is provided, applied to a first electronic device. The method includes:
displaying a third interface, the third interface including an identification of one or more intents, at least one of the one or more intents corresponding to a plurality of electronic devices;
in response to a user operation selecting a target intent from the one or more intents, displaying a fourth interface, the fourth interface including the target intent and identifications of one or more electronic devices in a first group, the first group being associated with user information; the one or more electronic devices are configured to execute the target intent and include a target electronic device;
receiving a user operation on the identification of the target electronic device; and
in response to the operation, instructing the target electronic device to execute the target intent. The user information includes any one or more of the following: the user's location, the user's behavior, and the current time.
Illustratively, as shown in (b) of fig. 16A, the smart panel displays an interface 1603 (the third interface), the interface 1603 including identifications of a plurality of intents. In response to a user operation selecting a target intent (the music playing intent) from the plurality of intents, the smart panel may jump to an interface 1607 (the fourth interface) corresponding to the music playing intent, as shown in fig. 16B; the interface 1607 includes the music playing intent (the target intent) and identifications of one or more electronic devices in the living room group (such as the identifications of speaker 1 and speaker 2).
The living room group is determined according to the user information. The smart panel receives a user operation on the identification of a target electronic device in the living room group (such as the identification of speaker 1), and in response to that operation instructs speaker 1 to execute the music playing intent.
According to this solution, the group of target electronic devices that the user wants to control can be determined from the user information, and the identification of that group and of each electronic device in it can be displayed. The user can thus quickly find the target electronic device to be controlled by browsing the identifications of the electronic devices in the group, and control it, which improves the efficiency of device control.
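The application does not disclose the rules that map user information to the first group; the following is a guessed policy, shown only to make the idea concrete, with all names and thresholds assumed.

```python
from datetime import datetime

def first_group(user_location: str, user_behavior: str, now: datetime) -> str:
    """Choose the device group associated with the user information
    (location, behavior, current time); the rules here are assumptions."""
    if user_behavior == "sleeping":
        return "master bedroom"
    if user_location:
        return user_location              # e.g. "living room" -> living room group
    return "living room" if 8 <= now.hour < 23 else "master bedroom"

print(first_group("living room", "watching TV", datetime.now()))  # -> living room
```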
In one possible design, receiving the user operation on the identification of the target electronic device includes:
receiving a user operation of moving the identification of the target electronic device toward the identification of the target intent.
In one possible design, the fourth interface further includes an identification of a second group, the second group including other electronic devices that can execute the target intent.
Illustratively, as shown in fig. 16B, in addition to the identifications of the electronic devices in the living room (the first group), the interface 1607 includes identifications of second groups such as the master bedroom, the secondary bedroom, and the study. In this way, the user can conveniently find and learn the device status of each space in the whole house.
In one possible design, the distance between the identification of the second group and the identification of the target intent is greater than the distance between the identification of the first group and the identification of the target intent. Illustratively, as shown in fig. 16B, the distance between the identification of a second group such as the study and the central intent identification (the music playing intent 163) is greater than the distance between the identification of the living room (i.e., the first group) and the central intent identification. Because the identification of the living room is relatively close to the central intent identification, the identifications of the electronic devices in the living room can be operated quickly, which facilitates controlling the corresponding electronic devices.
In one possible design, the fourth interface further includes a third control parameter of the target electronic device, the third control parameter being determined according to the user information.
For example, as shown in (b) of fig. 19, the interface 1601 further includes third control parameters such as a suggested brightness and a suggested color temperature.
In one possible design, the method further includes:
receiving a fourth control parameter corresponding to the target electronic device input by the user, so that the control parameter of the target electronic device is adjusted to the fourth control parameter. Illustratively, as shown in (c) of fig. 19, the smart panel may also receive a color temperature (e.g., 5500 K) input by the user. Subsequently, the smart panel can control the lamp in the study to turn on and adjust to a color temperature of 5500 K.
In one possible design, the one or more intents pertain to one or more subsystems, for example, the lighting subsystem or the audio and video subsystem.
In a third aspect, an intelligent device control method is provided, including:
a first electronic device displays a first interface, the first interface including an identification of one or more intents, at least one of the one or more intents corresponding to a plurality of electronic devices;
the first electronic device adds a target intent to a target scene in response to a user operation selecting the target intent from the one or more intents;
when a trigger condition of the target scene is met, the first electronic device sends a control instruction to one or more target electronic devices, the one or more target electronic devices being the electronic devices corresponding to the target intent; and
the target electronic device receives the control instruction from the first electronic device and executes the target intent according to the control instruction.
In one possible design, the target electronic devices include a second electronic device and a third electronic device;
the first electronic device sending a control instruction to the one or more target electronic devices includes: sending a first control instruction to the second electronic device and a second control instruction to the third electronic device, so that the second electronic device and the third electronic device jointly execute the target intent; and
the target electronic device receiving the control instruction from the first electronic device and executing the target intent according to the control instruction includes: the second electronic device receiving the first control instruction from the first electronic device and executing the target intent according to the first control instruction; and the third electronic device receiving the second control instruction from the first electronic device and executing the target intent according to the second control instruction.
In one possible design, the target intent includes a first target intent and a second target intent, and the target electronic devices include a second electronic device and a third electronic device;
the first electronic device sending a control instruction to the one or more target electronic devices includes:
sending a first control instruction to the second electronic device, the first control instruction being used to control the second electronic device to execute the first target intent; and
sending a second control instruction to the third electronic device, the second control instruction being used to control the third electronic device to execute the second target intent; and
the target electronic device receiving the control instruction from the first electronic device and executing the target intent according to the control instruction includes: the second electronic device receiving the first control instruction from the first electronic device and executing the first target intent according to the first control instruction; and the third electronic device receiving the second control instruction from the first electronic device and executing the second target intent according to the second control instruction.
In one possible design, the first interface further includes identifications of a plurality of spaces of the whole house; the plurality of spaces include a first space and a second space; the identification of the first space is selected; and the one or more intents include intents executable by electronic devices in the first space.
In one possible design, the method further comprises: the first electronic device receives the operation of the user on the identification of the second space, and displays a second interface, wherein the second interface comprises the identification of the executable intention of the electronic device in the second space.
In one possible design, the identity of the second space is selected; the one or more intents are a plurality of intents that also include intents executable by the electronic device in the second space.
In one possible design, the identification of the intent executable by the electronic device in the first space has a different user interface UI effect than the identification of the intent executable by the electronic device in the second space.
In one possible design, the control parameter corresponding to the target intent is a first control parameter, and the method further includes: the first electronic device receives a second operation input by the user; in response to the second operation, displays a first control, the first control being used to input a second control parameter corresponding to the target intent; and receives the second control parameter input by the user through the first control, so that the control parameter corresponding to the target intent is adjusted to the second control parameter.
In one possible design, the one or more intents pertain to one or more subsystems.
In a fourth aspect, an intelligent device control method is provided. The method includes:
a first electronic device displays a third interface, the third interface including an identification of one or more intents, at least one of the one or more intents corresponding to a plurality of electronic devices;
the first electronic device displays a fourth interface in response to a user operation selecting a target intent from the one or more intents, the fourth interface including the target intent and identifications of one or more electronic devices in a first group, the first group being associated with user information; the one or more electronic devices are configured to execute the target intent and include a target electronic device;
the first electronic device receives a user operation on the identification of the target electronic device;
the first electronic device, in response to the operation, sends a control instruction to the target electronic device, the control instruction being used to instruct the target electronic device to execute the target intent, the user information including any one or more of the following: the user's location, the user's behavior, and the current time; and
the target electronic device receives the control instruction from the first electronic device and executes the target intent according to the control instruction.
In one possible design, receiving the user operation on the identification of the target electronic device includes:
receiving a user operation of moving the identification of the target electronic device toward the identification of the target intent.
In one possible design, the fourth interface further includes an identification of a second group, the second group including other electronic devices that can execute the target intent.
In one possible design, the distance between the identification of the second group and the identification of the target intent is greater than the distance between the identification of the first group and the identification of the target intent.
In one possible design, the fourth interface further includes a third control parameter of the target electronic device, the third control parameter being determined according to the user information.
In one possible design, the method further includes:
receiving a fourth control parameter corresponding to the target electronic device input by the user, so that the control parameter of the target electronic device is adjusted to the fourth control parameter.
In a fifth aspect, an intelligent device control system is provided, including:
a first electronic device, configured to display a first interface, the first interface including an identification of one or more intents, at least one of the one or more intents corresponding to a plurality of electronic devices;
the first electronic device is further configured to add a target intent to a target scene in response to a user operation selecting the target intent from the one or more intents;
the first electronic device is further configured to send a control instruction to one or more target electronic devices when a trigger condition of the target scene is met, the one or more target electronic devices being the electronic devices corresponding to the target intent; and
the target electronic device is configured to receive the control instruction from the first electronic device and execute the target intent according to the control instruction.
In one possible design, the target electronic devices include a second electronic device and a third electronic device;
the first electronic device being configured to send a control instruction to the one or more target electronic devices includes: sending a first control instruction to the second electronic device and a second control instruction to the third electronic device, so that the second electronic device and the third electronic device jointly execute the target intent; and
the target electronic device being configured to receive the control instruction from the first electronic device and execute the target intent according to the control instruction includes: the second electronic device being configured to receive the first control instruction from the first electronic device and execute the target intent according to the first control instruction; and the third electronic device being configured to receive the second control instruction from the first electronic device and execute the target intent according to the second control instruction.
In one possible design, the target intent includes a first target intent and a second target intent, and the target electronic devices include a second electronic device and a third electronic device;
the first electronic device being configured to send a control instruction to the one or more target electronic devices includes:
sending a first control instruction to the second electronic device, the first control instruction being used to control the second electronic device to execute the first target intent; and
sending a second control instruction to the third electronic device, the second control instruction being used to control the third electronic device to execute the second target intent; and
the target electronic device being configured to receive the control instruction from the first electronic device and execute the target intent according to the control instruction includes: the second electronic device being configured to receive the first control instruction from the first electronic device and execute the first target intent according to the first control instruction; and the third electronic device being configured to receive the second control instruction from the first electronic device and execute the second target intent according to the second control instruction.
In one possible design, the first interface further includes identifications of a plurality of spaces of the whole house; the plurality of spaces include a first space and a second space; the identification of the first space is selected; and the one or more intents include intents executable by electronic devices in the first space.
In one possible design, the first electronic device is further configured to:
receive a user operation on the identification of the second space, and display a second interface, the second interface including identifications of the intents executable by the electronic devices in the second space.
In one possible design, the identification of the second space is selected, and the one or more intents are a plurality of intents that further include intents executable by the electronic devices in the second space.
In one possible design, the identifications of the intents executable by the electronic devices in the first space have a different user interface (UI) effect from the identifications of the intents executable by the electronic devices in the second space.
In one possible design, the control parameter corresponding to the target intent is a first control parameter, and the first electronic device is further configured to:
receive a second operation input by the user;
in response to the second operation, display a first control, the first control being used to input a second control parameter corresponding to the target intent; and
receive the second control parameter input by the user through the first control, so that the control parameter corresponding to the target intent is adjusted to the second control parameter.
In one possible design, the one or more intents pertain to one or more subsystems.
In a sixth aspect, an intelligent device control system is provided, including:
a first electronic device, configured to display a third interface, the third interface including an identification of one or more intents, at least one of the one or more intents corresponding to a plurality of electronic devices;
the first electronic device is further configured to display a fourth interface in response to a user operation selecting a target intent from the one or more intents, the fourth interface including the target intent and identifications of one or more electronic devices in a first group, the first group being associated with user information; the one or more electronic devices are configured to execute the target intent and include a target electronic device;
the first electronic device is further configured to receive a user operation on the identification of the target electronic device;
the first electronic device is further configured to send, in response to the operation, a control instruction to the target electronic device, the control instruction being used to instruct the target electronic device to execute the target intent, the user information including any one or more of the following: the user's location, the user's behavior, and the current time; and
the target electronic device is configured to receive the control instruction from the first electronic device and execute the target intent according to the control instruction.
In one possible design, receiving the user operation on the identification of the target electronic device includes:
receiving a user operation of moving the identification of the target electronic device toward the identification of the target intent.
In one possible design, the fourth interface further includes an identification of a second group, the second group including other electronic devices that can execute the target intent.
In one possible design, the distance between the identification of the second group and the identification of the target intent is greater than the distance between the identification of the first group and the identification of the target intent.
In one possible design, the fourth interface further includes a third control parameter of the target electronic device, the third control parameter being determined according to the user information.
In one possible design, the first electronic device is further configured to:
receive a fourth control parameter corresponding to the target electronic device input by the user, so that the control parameter of the target electronic device is adjusted to the fourth control parameter.
In one possible design, the one or more intents pertain to one or more subsystems.
In a seventh aspect, an embodiment of this application provides an electronic device. The electronic device has the function of implementing the intelligent device control method described in any one of the above aspects and any one of their possible implementations. The function may be implemented by hardware, or by hardware executing corresponding software. The hardware or software includes one or more modules corresponding to the function.
In an eighth aspect, a computer-readable storage medium is provided. The computer-readable storage medium stores a computer program (which may also be referred to as instructions or code) that, when executed by an electronic device, causes the electronic device to perform the method of any one of the aspects or any one of the implementations of any one of the aspects.
In a ninth aspect, an embodiment of this application provides a computer program product that, when run on an electronic device, causes the electronic device to perform the method of any one of the aspects or any one of the implementations of any one of the aspects.
In a tenth aspect, an embodiment of this application provides a circuit system, including a processing circuit configured to perform the method of any one of the aspects or any one of the implementations of any one of the aspects.
In an eleventh aspect, an embodiment of this application provides a chip system, including at least one processor and at least one interface circuit. The at least one interface circuit is configured to perform transceiving functions and send instructions to the at least one processor. When the at least one processor executes the instructions, the at least one processor performs the method of any one of the aspects or any one of the implementations of any one of the aspects.
Drawings
FIG. 1 is a schematic diagram of a scene editing method in the related art;
FIG. 2 is a schematic diagram of a system architecture according to an embodiment of the present application;
FIG. 3A is a schematic diagram of dividing a whole house into subsystems according to an embodiment of the present application;
fig. 3B is a schematic diagram of a subsystem and atomic capability corresponding to an away-from-home scenario according to an embodiment of the present application;
Fig. 4 is a schematic hardware structure of a first electronic device according to an embodiment of the present application;
fig. 5 is a schematic software structure of a first electronic device according to an embodiment of the present application;
FIG. 6 is a schematic diagram of a system architecture according to an embodiment of the present application;
FIGS. 7-9 are schematic diagrams illustrating interfaces provided by embodiments of the present application;
FIG. 10 is a schematic diagram of association of a scenario, a device, and a subsystem provided by an embodiment of the present application;
FIG. 11 is a schematic view of an interface provided by an embodiment of the present application;
FIG. 12A is a schematic illustration of an interface provided by an embodiment of the present application;
FIG. 12B is a schematic illustration of an interface provided by an embodiment of the present application;
FIG. 12C is a schematic illustration of an interface provided by an embodiment of the present application;
FIG. 13 is a schematic diagram of a viewing mode scene according to an embodiment of the present application;
fig. 14 and 15 are schematic views of interfaces according to embodiments of the present application;
FIG. 16A is a schematic illustration of an interface provided by an embodiment of the present application;
FIG. 16B is a schematic illustration of an interface provided by an embodiment of the present application;
FIGS. 17-22 are schematic diagrams illustrating interfaces provided by embodiments of the present application;
fig. 23 is a schematic structural diagram of a first electronic device according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application are described below with reference to the accompanying drawings. In the description of the embodiments, the terminology used below is for the purpose of describing particular embodiments only and is not intended to limit the application. As used in the specification of the application and the appended claims, the singular forms "a", "an", and "the" are intended to include expressions such as "one or more", unless the context clearly indicates otherwise. It should also be understood that, in the following embodiments of the present application, "at least one" means one or more, and "a plurality of" means two or more.
Reference in the specification to "one embodiment" or "some embodiments" or the like means that a particular feature, structure, or characteristic described in connection with the embodiment is included in one or more embodiments of the application. Thus, appearances of the phrases "in one embodiment," "in some embodiments," "in other embodiments," and the like in the specification are not necessarily all referring to the same embodiment, but mean "one or more but not all embodiments" unless expressly specified otherwise. The terms "comprising," "including," "having," and variations thereof mean "including but not limited to," unless expressly specified otherwise. The term "coupled" includes both direct and indirect connections, unless stated otherwise. The terms "first," "second," and the like are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated.
In embodiments of the application, words such as "exemplary" or "such as" are used to mean serving as an example, instance, or illustration. Any embodiment or design described herein as "exemplary" or "for example" is not necessarily to be construed as preferred or advantageous over other embodiments or designs. Rather, the use of words such as "exemplary" or "such as" is intended to present related concepts in a concrete fashion.
In some scenarios, various electronic devices have entered people's lives, and the concept of a smart home system has been proposed for the household use of electronic devices. Taking the house as a platform, a smart home system uses technologies such as the internet of things and automatic control to organically combine the electronic devices and applications related to home life. The electronic devices in a smart home system are, for example, smart home devices, which are intelligent devices including audio and video devices (such as large-screen devices and Bluetooth speakers), lighting devices (such as ceiling lamps, desk lamps, and spotlights), environment control devices (such as air conditioners and air purifiers), anti-theft alarm devices (such as human body sensors and cameras), and the like.
For example, assume that an air conditioner serves as a smart home device and is connected to a mobile phone; the air conditioner can then receive control commands sent by the user through the mobile phone. For example, on receiving an "on" command input by the user through the mobile phone, the air conditioner starts automatically. For another example, on receiving a command input by the user through the mobile phone to adjust the temperature to 26°C, the air conditioner automatically adjusts the temperature to 26°C. Optionally, a smart home application (such as a smart life application) is installed in the electronic device (such as the mobile phone), and the electronic device can be paired with the smart home devices through the smart home application to manage and control them.
However, to implement control of a smart home device, the electronic device needs to establish a connection with the smart home device in advance and configure it.
Illustratively, assuming that a smart life application is installed in a mobile phone, the mobile phone starts the smart life application and displays a scene interface 201 as shown in (a) of fig. 1. After detecting the user's operation of clicking the add control 21, the mobile phone displays an interface 202 as shown in (b) of fig. 1 and receives the user's scene-creation operations, such as adding the condition for controlling a smart home device and the task the smart home device needs to execute. For example, the user may click the add condition control 26 and add a condition for controlling the smart home device, such as the trigger condition "when the scene card is clicked" shown in (f) of fig. 1. In this way, when the mobile phone subsequently detects that the user clicks the scene card of this scene, it can control the smart home device to execute the corresponding task.
For another example, as shown in (b) of fig. 1, after detecting the user's operation of clicking the add task control 22, the mobile phone displays an interface 203 as shown in (c) of fig. 1. On the interface 203, after the mobile phone detects the operation of clicking the smart device control 23 and determines that the user needs to add a task of controlling a smart device, it may display an interface 204 as shown in (d) of fig. 1, the interface 204 including the controllable smart devices. Upon detecting that the user clicks the "living room air conditioner" option 27, the mobile phone may jump to an interface 205 as shown in (e) of fig. 1. The interface 205 includes the operations that can be executed. Assuming the user is detected clicking the "open" option 28, the mobile phone may jump to an interface 206 as shown in (f) of fig. 1, the interface 206 including a control 24. Upon detecting the user clicking the control 24, the mobile phone may jump back to the interface 203 shown in (c) of fig. 1. Thereafter, the user may click the "smart device" option 23 again, the mobile phone may jump to the interface 204 shown in (d) of fig. 1, and another device may be selected as a device to be controlled in the scene to be created.
It can be seen that, in the scene editing process shown in fig. 1, the user needs to set each device to be controlled in the scene separately and repeatedly switch between multiple interfaces, so the scene editing process is complex and the user's operations are difficult.
To solve the above technical problem, an embodiment of the present application provides an intelligent device control method that arranges scenes at the granularity of intents. Because an intent can usually be implemented by one or more smart home devices, in some cases one intent-granularity scene arrangement by the user is equivalent to multiple device-granularity scene arrangements, which greatly simplifies the user's scene arrangement operations and improves scene arrangement efficiency. Illustratively, fig. 2 is a schematic diagram of a smart device control system to which the method is applicable. As shown in fig. 2, the smart device control system may manage smart devices in units of households (homes). One home may be called a whole house, and the whole house may be divided into different spaces; for example, the whole house includes an entrance hallway, a kitchen, a dining room, a living room, a balcony, a master bedroom, a secondary bedroom, a bathroom, and the like.
The whole-house system may include a first electronic device 100, and the first electronic device 100 may be configured to control a second electronic device 200, such as an internet of things (IoT) device. The first electronic device 100 includes, but is not limited to, a mobile phone, a PC, a tablet computer, a smart home control panel (which may be simply referred to as a smart panel), and the like.
As one possible implementation, an application for controlling the second electronic device 200 may be installed in the first electronic device 100. The application may be a system pre-installed application or a non-pre-installed application (such as an application downloaded from an application market). It should be understood that a system pre-installed application includes part of a system application (such as a service, component, or plug-in within the system application) or a stand-alone application pre-installed in the first electronic device 100; a stand-alone application has its own application icon. The application may be, for example, a smart life application.
Alternatively, the first electronic device 100 may also control the second electronic device 200 through a control center. For example, the control center may be a shortcut control page displayed by the first electronic device 100 in response to the user's operation of sliding down from the upper right corner or the top of the screen.
Alternatively, the first electronic device 100 may also control the second electronic device 200 through a corresponding function menu on the negative one screen. For example, the negative one screen may be a system service capability entry page displayed by the first electronic device 100 in response to the user's rightward sliding operation on the leftmost home screen.
The embodiment of the present application does not limit the specific manner in which the first electronic device 100 is used to control the second electronic device 200.
The whole house is also provided with second electronic devices 200 (e.g., IoT devices). The second electronic device 200 may also be referred to as a controlled device and may be controlled by the first electronic device 100. For example, the kitchen is provided with an electric cooker or electric pressure cooker, a gas appliance, and the like; the living room is provided with speakers (e.g., smart speakers), a television (e.g., a smart television, also known as a smart screen or large screen), a routing device, and the like.
It should be noted that, although fig. 2 shows only a smart television and a body fat scale as the second electronic device 200, those skilled in the art should understand that the second electronic device 200 includes, but is not limited to, smart televisions, smart speakers, smart lamps (e.g., ceiling lamps, smart desk lamps, fragrance lamps), sweeping robots, smart clothes hangers, smart electric cookers, air purifiers, humidifiers, desktop computers, routing devices, smart sockets, water dispensers, smart refrigerators, smart air conditioners, smart switches, smart door locks, and other smart home devices. The second electronic device 200 may also be a device other than a smart home device, such as a personal computer (PC), a tablet computer, a mobile phone, or a smart remote controller. The embodiment of the present application does not limit the specific form of the second electronic device 200.
In some examples, one of the second electronic devices 200 may act as a master device for controlling the other second electronic devices 200.
In some embodiments, the second electronic devices 200 may be of many kinds, and subsystems, such as a lighting subsystem, an environment subsystem, and a security subsystem, may be divided according to the functions of the second electronic devices 200. Each subsystem may correspond to one or more intents, and each subsystem includes one or more devices.
For example, in a smart home application scenario, an intent expresses a user expectation and may include, for example: light on/off, music playing, purification off, purification on, curtain fully open, constant temperature, constant humidity, and the like.
By way of example, the lighting subsystem may include various types of lamps (including but not limited to ambient lamps), and the intents corresponding to the lighting subsystem may include: lights fully on, lights fully off, and combined light on (brightness parameter 60%).
As another example, the cooling and heating fresh air subsystem includes various devices capable of adjusting temperature and humidity, and the intents corresponding to the cooling and heating fresh air subsystem include: a constant temperature intent, a constant humidity intent, a constant purification intent, and the like.
As yet another example, the sunshade subsystem includes various types of devices that can provide sunshade, and the intents that the sunshade subsystem can implement include: curtain fully open and curtain fully closed.
From the above description, it can be understood that each intent may be implemented by one or more devices, or in other words, each intent corresponds to one or more devices, and each device may implement one or more intents. For example, the intent to adjust light may be implemented by lights and/or curtains, while a curtain can implement both the intent to adjust light and the intent to provide sunshade and sun protection.
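This many-to-many relation between intents and devices can be held in a simple table; the sketch below is illustrative only and its entries are assumed.

```python
# Each intent maps to one or more devices, and one device may serve
# several intents (illustrative data, not from the patent).
DEVICES_FOR_INTENT = {
    "adjust light": ["ceiling light", "curtain"],
    "sun shading": ["curtain"],
}

def intents_of(device: str) -> list[str]:
    """All intents a given device can help implement."""
    return [i for i, devs in DEVICES_FOR_INTENT.items() if device in devs]

print(intents_of("curtain"))   # -> ['adjust light', 'sun shading']
```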
Then, after determining the user's intent and the second electronic device 200 that the user needs to control, the first electronic device 100 may instruct the second electronic device 200 to execute the user's intent.
In the embodiment of the present application, multiple devices in one or more subsystems can be freely combined and fused into a super terminal. The devices become functional modules of one another, realizing capability interaction and resource sharing.
Optionally, the system may also include a hub device 300. The hub device 300 is also referred to as a hub, a central control system, or a host. In some examples, the hub device 300 may be used to divide the devices of the whole house into multiple subsystems, abstract the capabilities of the devices in each subsystem at subsystem granularity to form the atomic capabilities of the subsystem, and adapt to different types of devices.
The hub device 300 may also generate a configuration file for a subsystem based on the atomic capabilities of the subsystem. Illustratively, as shown in fig. 3A, the whole house includes subsystems including, but not limited to, the following: security, lighting, network, cooling and heating fresh air, audio and video entertainment, furniture, water use, energy use, home appliances, and sunshade.
As one possible implementation, different scenarios require invoking the atomic capabilities of different subsystems in order to meet the user's device control needs. Taking the away-from-home scenario as an example, as shown in fig. 3B, in some examples the security subsystem, the lighting subsystem, and the cooling and heating fresh air subsystem need to be invoked in the away-from-home scenario. Taking the lighting subsystem as an example, the hub device 300 abstracts the capabilities of the devices in the lighting subsystem (including, but not limited to, the living room lights, floor lights, and curtains shown in fig. 3B) to form one or more atomic capabilities of the lighting subsystem (e.g., lighting modes, on-off control, and precise dimming). One possible example of a configuration file for the lighting subsystem is shown in Table 1 below. As Table 1 shows, the lighting subsystem may be abstracted into atomic capabilities such as lighting modes, on-off control, and precise dimming. Optionally, the configuration file of a subsystem further includes the identifications of the devices included in the subsystem.
TABLE 1
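The body of Table 1 is not reproduced here. As a purely illustrative stand-in, the following Python dictionary sketches what a lighting-subsystem configuration file of this kind might contain; every field name and value below is an assumption for illustration, not the actual file format used by the hub device.

```python
# Hypothetical subsystem configuration file, expressed as a Python dict.
# All field names and values are illustrative assumptions only.
lighting_profile = {
    "subsystem": "lighting",
    "atomic_capabilities": [
        {"name": "lighting_mode", "modes": ["sleep", "reading", "atmosphere"]},
        {"name": "on_off_control"},
        {"name": "precise_dimming", "range": [0, 100]},  # brightness in percent
    ],
    # Optional field: identifiers of the devices included in the subsystem.
    "devices": ["living_room_light", "floor_light", "curtain_motor"],
}
```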
It should be noted that the atomic capabilities of a subsystem may be obtained by abstracting the capabilities of some of the devices in the subsystem, or by abstracting the capabilities of all devices in the subsystem. An atomic capability of a subsystem corresponds to one or more devices in the subsystem. For example, the atmosphere light atomic capability can be realized by a plurality of intelligent lamps; that is, the plurality of intelligent lamps can jointly create a light-and-shadow effect with a specific atmosphere. For another example, the brightness adjustment capability may be achieved by a plurality of intelligent lamps: the brightness of intelligent lamp A is adjusted to brightness a, the brightness of intelligent lamp B to brightness b, and the brightness of intelligent lamp C to brightness c, so that intelligent lamps A, B, and C together create a light-and-shadow scene whose overall brightness is the target brightness.
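As a minimal sketch of how one atomic capability can be realized jointly by several devices, the function below assigns each intelligent lamp its own level so that the lamps together render a scene at the target brightness; the weight-based policy and the lamp names are illustrative assumptions, not the actual dispatch logic.

```python
def apply_brightness(weights: dict[str, float], target: int) -> dict[str, int]:
    """Assign each lamp its own level (a, b, c, ...) so that the lamps
    jointly create a light-and-shadow scene at the target brightness.

    A simple per-lamp weighting is assumed here; a real system could
    weight lamps by position, wattage, or user preference.
    """
    return {lamp: round(target * w) for lamp, w in weights.items()}

# Lamps A, B and C are driven to different levels to jointly reach 60%.
settings = apply_brightness({"lamp_a": 0.8, "lamp_b": 1.0, "lamp_c": 0.6}, 60)
```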
In the embodiment of the application, one intent can correspond to one or more atomic capabilities, and one atomic capability can also correspond to one or more intents; the embodiment of the application does not limit the specific correspondence between intents and atomic capabilities. For example, the intents may include: lamp on/off, music playing, constant temperature, constant humidity, purification off, curtains fully open/closed, and the like. By way of example, the lamp-on intent may correspond to the atomic capabilities of sleep light, reading light, atmosphere light, zone lighting, all lights on, main light, etc., shown in Table 1.
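The many-to-many relationship between intents and atomic capabilities can be pictured as a simple mapping. The sketch below is an illustration only; the intent and capability names echo the examples above rather than any actual schema.

```python
# Hypothetical many-to-many mapping between intents and atomic capabilities.
INTENT_TO_CAPABILITIES: dict[str, list[str]] = {
    "turn_on_light": ["sleep_light", "reading_light", "atmosphere_light",
                      "zone_lighting", "all_lights_on", "main_light"],
    "constant_temperature": ["temperature_control"],
    "curtains_fully_open": ["curtain_motion"],
}

def capabilities_for(intent: str) -> list[str]:
    """Resolve the atomic capabilities that can fulfil a given intent."""
    return INTENT_TO_CAPABILITIES.get(intent, [])
```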
After obtaining the atomic capabilities of each subsystem in the smart home system and the intents corresponding to those atomic capabilities, the first electronic device 100 may orchestrate or edit scenes using the intents executable by the subsystems, instead of orchestrating scenes device by device. In one design, one intent can be executed by one or more smart home devices, so in some cases one scene orchestration performed by the user at intent granularity is equivalent to multiple orchestrations at device granularity, which greatly simplifies the user's orchestration operations and improves orchestration efficiency. Alternatively, in other designs, a user may add multiple intents to a scene to be created, each of which may be executed by one or more smart home devices.
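A minimal sketch of a scene orchestrated at intent granularity might look as follows; the Scene structure, field names, and intent strings are assumptions for illustration only, not the device's actual data model.

```python
from dataclasses import dataclass, field

@dataclass
class Scene:
    """A scene orchestrated at intent granularity rather than device granularity."""
    name: str
    trigger: str                        # e.g. "user clicks the scene card"
    intents: list[str] = field(default_factory=list)

    def add_intent(self, intent: str) -> None:
        self.intents.append(intent)

viewing_mode = Scene(name="viewing_mode", trigger="card_clicked")
viewing_mode.add_intent("combination_light_on_brightness_60")
viewing_mode.add_intent("constant_temperature_24c")
viewing_mode.add_intent("curtains_fully_closed")
```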
In some examples, the atomic capabilities of a subsystem may be updated periodically or on a preset trigger. Illustratively, the lighting subsystem includes a reading light, a wall light strip, a ceiling light strip, and the like. If the reading light is later damaged, then upon detecting that the reading light has been offline for a long period, the atomic capabilities related to the reading light may be removed from the lighting subsystem and the configuration file of the lighting subsystem updated.
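The offline-pruning update described above could look roughly like the following sketch, assuming a last-seen timestamp per device and an arbitrary one-week threshold for "offline for a long period"; both assumptions are for illustration only.

```python
import time

OFFLINE_THRESHOLD_S = 7 * 24 * 3600  # "long period" assumed to be one week

def prune_offline_devices(profile: dict, last_seen: dict[str, float]) -> dict:
    """Return an updated subsystem profile with long-offline devices removed.

    Atomic capabilities that depended only on a removed device would be
    dropped in the same pass (omitted here for brevity).
    """
    now = time.time()
    online = [d for d in profile["devices"]
              if now - last_seen.get(d, 0.0) < OFFLINE_THRESHOLD_S]
    return {**profile, "devices": online}
```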
Alternatively, the hub device of each room or area and the hub device of the whole house may exist as separate devices, may be integrated with the second electronic device or the first electronic device into one device, or may take other forms. The application is not limited in this regard.
Optionally, in one example, the system may further include a server. The server may be used to maintain subsystem configuration files.
Optionally, in one example, the system further comprises a routing device (such as a router). A routing device is used to connect to a local area network or the internet, selecting and setting the path of transmitted signals using a specific protocol. The second electronic device 200 accesses the router and performs data transmission with devices in the local area network or on the internet through a Wi-Fi channel established by the router. In one embodiment, hub device 300 may be integrated with the routing device as one device; for example, hub device 300 is integrated into the routing device, i.e., the routing device has the functionality of hub device 300. The routing device may be one or more routing devices in a primary-and-secondary routing arrangement, or may be an independent routing device.
The above is merely an example of a system to which the device control method is applicable; the system may include more or fewer devices, or devices in different layout positions.
Alternatively, the first electronic device 100, the second electronic device 200, the server, and the hub device 300 in the embodiments of the present application may be implemented by different devices. For example, the server and the hub device in the embodiments of the present application may be implemented by the device in fig. 4. Fig. 4 is a schematic diagram of a hardware structure of a device according to an embodiment of the present application. The device comprises at least one processor 501, communication lines 502, a memory 503, and at least one communication interface 504. The memory 503 may alternatively be included in the processor 501.
It should be understood that the structure illustrated in the embodiments of the present application does not constitute a specific limitation on the electronic device. In other embodiments of the application, the electronic device may include more or fewer components than illustrated, certain components may be combined or split, or the components may be arranged differently. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
The processor 501 may be a general purpose central processing unit (central processing unit, CPU), microprocessor, application-specific integrated circuit (ASIC), or one or more integrated circuits for controlling the execution of the program of the present application.
Communication line 502 may include a pathway to transfer information between the aforementioned components.
A communication interface 504 for communicating with other devices. In the embodiment of the present application, the communication interface may be a module, a circuit, a bus, an interface, a transceiver, or other devices capable of implementing a communication function, for communicating with other devices. Alternatively, when the communication interface is a transceiver, the transceiver may be a separately provided transmitter that is operable to transmit information to other devices, or a separately provided receiver that is operable to receive information from other devices. The transceiver may also be a component that integrates the functions of transmitting and receiving information, and embodiments of the present application are not limited to the specific implementation of the transceiver.
The memory 503 may be, but is not limited to, a read-only memory (ROM) or other type of static storage device that can store static information and instructions, a random access memory (RAM) or other type of dynamic storage device that can store information and instructions, an electrically erasable programmable read-only memory (EEPROM), a compact disc read-only memory (CD-ROM) or other optical disc storage (including compact discs, laser discs, optical discs, digital versatile discs, Blu-ray discs, etc.), magnetic disk storage media or other magnetic storage devices, or any other medium that can be used to carry or store desired program code in the form of instructions or data structures and that can be accessed by a computer. The memory may be self-contained and coupled to the processor via communication line 502, or may be integrated with the processor.
The memory 503 is used to store the computer-executable instructions for carrying out the solutions of the present application, and their execution is controlled by the processor 501. The processor 501 is configured to execute the computer-executable instructions stored in the memory 503, thereby implementing the methods provided by the embodiments of the present application described below.
Alternatively, the computer-executable instructions in the embodiments of the present application may be referred to as application code, instructions, computer programs, or other names, and the embodiments of the present application are not limited in detail.
In a specific implementation, as one embodiment, the processor 501 may include one or more CPUs, such as CPU0 and CPU1 in FIG. 4.
In a specific implementation, as one embodiment, the electronic device may include multiple processors, such as processor 501 and processor 507 in FIG. 4. Each of these processors may be a single-core (single-CPU) processor or a multi-core (multi-CPU) processor. A processor here may refer to one or more devices, circuits, and/or processing cores for processing data (e.g., computer program instructions).
It will be appreciated that the configuration illustrated in fig. 4 does not constitute a specific limitation on the electronic device. In other embodiments of the application, the electronic device may include more or fewer components than illustrated, certain components may be combined or split, or the components may be arranged differently. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
The device shown in fig. 4 may be a general-purpose device or a special-purpose device; the embodiments of the present application do not limit the type of the device. For example, the device may be an intelligent panel, a special-purpose device for controlling smart home devices. For another example, the device may be a mobile phone, a general-purpose device capable of controlling smart home devices.
Alternatively, the software system of the electronic device (such as the first electronic device, the second electronic device, the server, or the hub device) may employ a layered architecture, an event-driven architecture, a microkernel architecture, a microservice architecture, or a cloud architecture. The embodiments of the application take the Android system with a layered architecture as an example to illustrate the software structure of the electronic device.
Taking the first electronic device as an example, fig. 5 is a software architecture block diagram of the first electronic device 100 according to an embodiment of the present application.
The layered architecture divides the software into several layers, each with a clear role and division of labor. The layers communicate with each other through software interfaces. In some embodiments, the Android system is divided into four layers: from top to bottom, the application layer, the application framework layer, the Android runtime and system libraries, and the kernel layer.
The application layer may include a series of applications.
As shown in fig. 5, the applications may include camera, gallery, calendar, phone call, map, navigation, WLAN, bluetooth, music, video, short message, etc. applications.
In the embodiment of the application, the applications also include a smart home management application and a basic service. The basic service exposes the management capabilities of smart devices to the system. The smart home management application may invoke the basic service to query the smart home devices to be controlled and/or invoke the basic service to control the smart home devices.
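A hedged sketch of what such a basic service interface might look like is given below; the class and method names (query_devices, control_device) and the metadata layout are assumptions for illustration and do not reflect any actual HiLink interface.

```python
class BaseService:
    """Hypothetical basic service exposing device management to applications."""

    def __init__(self, devices: dict[str, dict]):
        self._devices = devices  # device id -> metadata, e.g. supported capabilities

    def query_devices(self, capability: str) -> list[str]:
        """Return the ids of devices supporting the given capability."""
        return [dev_id for dev_id, meta in self._devices.items()
                if capability in meta.get("capabilities", [])]

    def control_device(self, dev_id: str, command: str, **params) -> None:
        """Send a control command to one device (transport omitted)."""
        print(f"-> {dev_id}: {command} {params}")

svc = BaseService({"living_room_light": {"capabilities": ["dimming"]}})
svc.control_device("living_room_light", "set_brightness", percent=60)
```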
For example, the smart home management application may be a smart life application, or another application with similar functionality. The smart home management application may be a preinstalled system application or a third-party application; the embodiments of the application do not limit the category of the smart home management application. The following embodiments mainly take the smart life application as an example.
The application framework layer provides an application programming interface (application programming interface, API) and programming framework for application programs of the application layer. The application framework layer includes a number of predefined functions.
As shown in fig. 5, the application framework layer may include a window manager, a content provider, a view system, a telephony manager, a resource manager, a notification manager, and the like.
The window manager is used for managing window programs. The window manager can acquire the size of the display screen, determine whether there is a status bar, lock the screen, capture the screen, and the like. The content provider is used to store and retrieve data and make such data accessible to applications. The data may include video, images, audio, calls made and received, browsing history and bookmarks, phonebooks, and the like. The view system includes visual controls, such as controls for displaying text and controls for displaying pictures, and may be used to build applications. A display interface may be composed of one or more views; for example, a display interface including a text message notification icon may include a view displaying text and a view displaying a picture. The telephony manager provides the communication functions of the first electronic device 100, such as management of call status (including connected, hung up, etc.). The resource manager provides various resources for applications, such as localized strings, icons, pictures, layout files, and video files. The notification manager allows an application to display notification information in the status bar; it can be used to convey notification-type messages that automatically disappear after a short stay without user interaction. For example, the notification manager is used to notify that a download is complete, to give message reminders, and so on. The notification manager may also present notifications in the form of charts or scroll-bar text in the system top status bar (such as notifications of applications running in the background), or notifications that appear on the screen in the form of a dialog window. For example, text information is prompted in the status bar, an alert tone sounds, the terminal vibrates, or an indicator light blinks.
The Android runtime includes a core library and a virtual machine, and is responsible for scheduling and management of the Android system. The core library consists of two parts: one part is the functional functions that the Java language needs to call, and the other part is the core library of Android. The application layer and the application framework layer run in the virtual machine. The virtual machine executes the Java files of the application layer and the application framework layer as binary files, and performs functions such as object lifecycle management, stack management, thread management, security and exception management, and garbage collection.
The system library may include a plurality of functional modules, for example: a surface manager, media libraries, a three-dimensional graphics processing library (e.g., OpenGL ES), and a 2D graphics engine (e.g., SGL). The surface manager is used to manage the display subsystem and provides fusion of 2D and 3D layers for multiple applications. The media libraries support playback and recording of a variety of commonly used audio and video formats, as well as still image files, and may support a variety of audio and video encoding formats such as MPEG4, H.264, MP3, AAC, AMR, JPG, and PNG. The three-dimensional graphics processing library is used to implement three-dimensional graphics drawing, image rendering, compositing, layer processing, and the like. The 2D graphics engine is a drawing engine for 2D drawing.
The kernel layer is a layer between hardware and software. The kernel layer contains at least a display driver, a camera driver, an audio driver, and a sensor driver.
It should be noted that, the software architecture shown in fig. 5 is only one software architecture applicable to the electronic device, and the embodiment of the application does not limit the software architecture of the electronic device. Alternatively, some of the functional modules may be located in a different hierarchy than that shown in fig. 5. For example, the base service may also be provided in a framework layer, to which embodiments of the present application are not limited.
Fig. 6 shows a functional module division of each device in the system shown in fig. 2, and an interaction flow between the devices. The first electronic device 100 includes a smart life application and a smart life basic service (such as HiLinkSVC). The smart life application provides a user interface (UI) for full-house intelligence. As one possible implementation, the smart life application invokes the smart life basic service through an interface provided by a software development kit (SDK), such as the HiLink Kit SDK, in order to present various UIs through the smart life application. Optionally, the UIs include a UI for intent management, a UI for scene management, or the like.
Alternatively, the UI may comprise an interface in the form of a card. Illustratively, the smart life application may provide an interface 402, as shown in FIG. 7, and the like. The user can control the intelligent household equipment through an interface provided by the intelligent life application.
As one possible implementation, the smart life basic service may store the configuration file of each subsystem. As one possible implementation, the configuration file of a subsystem may be obtained from the first server.
The smart life basic service may include the following modules:
An intent management module, configured to manage the intents of a subsystem according to the subsystem's configuration file. For example, the user may combine certain intents to form a new intent, such as combining the intents of lighting, curtains, and fresh air devices to generate a new "comfort environment" intent (a sketch of such intent combination follows the module descriptions below). Further by way of example, the device control parameters corresponding to an intent may be adjusted, intents may be deleted or added, and new intents such as birthday party, warm home, whole-house dehumidification, and whole-house ventilation may be generated.
A scene management module, configured to manage scenes. Scene management includes creating scenes, modifying scenes, and the like.
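To make the intent-combination behavior of the intent management module concrete, below is a minimal Python sketch; the registry structure and the intent names are assumptions for illustration, not the module's actual implementation.

```python
def combine_intents(registry: dict[str, list[str]], name: str,
                    members: list[str]) -> None:
    """Register a new composite intent built from existing intents."""
    registry[name] = members

registry: dict[str, list[str]] = {}
# Lighting, curtain and fresh-air intents combined into a new intent.
combine_intents(registry, "comfort_environment",
                ["warm_light_on", "curtain_open", "fresh_air_on"])
```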
Hub device 300 includes a device list management module, a blacklist management module, a fine tuning data management module, and a batch management module.
The device list management module is used for acquiring any one or more of the following information: the list of subsystems under Home/space/Room, the configuration file of each subsystem, the list of instantiated devices in each subsystem, and the list of intents each subsystem can implement.
A blacklist management module, configured to add/delete/modify/query the blacklisted devices in each subsystem.
A fine-tuning data management module, configured to add/delete/modify/query the fine-tuning parameters of intents. By way of example, assuming the intent is to turn on the combination light and the combination light's color temperature parameter is 2000 K, the user may modify the device control parameter corresponding to the intent, such as changing the color temperature of the combination light to 3000 K; the fine-tuning data management module then modifies the fine-tuning parameter (i.e., the color temperature) of the combination-light-on intent from 2000 K to 3000 K (see the sketch following these module descriptions).
Wherein the combination lamp may be one or more lamps.
A batch management module, configured to divide the devices of the whole house into a plurality of subsystems and abstract the capabilities of each subsystem to obtain one or more atomic capabilities that the subsystem can implement.
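As a hedged illustration of the fine-tuning data management module described above, the sketch below stores per-intent fine-tuning parameters in a plain dictionary; the function name and storage layout are assumptions, not the actual module interface.

```python
def set_fine_tuning(store: dict[str, dict], intent: str,
                    parameter: str, value) -> None:
    """Add or modify a fine-tuning parameter of an intent."""
    store.setdefault(intent, {})[parameter] = value

store = {"combination_light_on": {"color_temperature_k": 2000}}
# The user adjusts the intent's color temperature from 2000 K to 3000 K.
set_fine_tuning(store, "combination_light_on", "color_temperature_k", 3000)
```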
As one possible implementation, hub device 300 stores the list of all devices, the list of blacklisted devices, and the fine-tuning parameters of intents. Alternatively, hub device 300 may obtain the device list, the blacklist, and the fine-tuning parameters from the server.
It will be appreciated that the configuration illustrated in fig. 6 does not constitute a specific limitation on each electronic device. In other embodiments of the application, each electronic device may include more or fewer modules than shown, or some modules may be combined, some modules may be split, or different arrangements of modules may be used. The illustrated modules may be implemented in hardware, software, or a combination of software and hardware.
The smart device control method according to the embodiments of the present application will be described below, mainly taking the first electronic device 100 as a control panel for smart devices (which may be referred to simply as an intelligent panel) and taking the application for managing smart home devices as the smart life application.
In some embodiments, after an electronic device accesses a local area network, it can discover other electronic devices that access the same local area network and/or are logged in to the same account. For example, a user registers with the smart life application and obtains a user name and password for an account. During network provisioning of a new electronic device, the user can log in to the account through another electronic device that has already been provisioned (such as a mobile phone) to assist the new electronic device with provisioning. The server then groups the electronic devices under the same account into the same home, so that the electronic devices are managed on a per-home basis. Optionally, the server manages one or more homes, each home including one or more subsystems, and each subsystem including one or more electronic devices.
Specifically, taking the intelligent panel as an example: after the intelligent panel logs in to the smart life application and detects the user's operation of adding an electronic device, it sends the device information of the newly added electronic device to the server; the server determines an ID for the electronic device and assigns the electronic device to the home corresponding to the account currently logged in on the intelligent panel, completing the provisioning of the electronic device. Alternatively, the intelligent panel, in response to the user operation, sends the device information of the newly added electronic device to the server, and the server determines the ID of the electronic device and assigns it to the home corresponding to the intelligent panel.
After adding the electronic device, the user may edit scenes through the electronic device. Illustratively, as shown in fig. 7 (a), the first electronic device displays an interface 401, which includes a scene option 401a. Upon detecting the user clicking option 401a, the electronic device may jump to interface 402 as shown in fig. 7 (b). Interface 402 includes a card corresponding to the full house and cards corresponding to each space in the full house (e.g., card 402a corresponding to the living room). The card corresponding to each space also includes a control for setting the scene of that space; taking the living room card 402a as an example, the card 402a carries a control 402b for setting a living room scene.
It should be noted that the same intent may have different presentation forms, i.e., the intent card may be displayed in different forms in the interface. Taking the lighting intent as an example: in some cases (such as certain fine lighting or atmosphere lighting scenes) or in certain spaces, a user adding a lighting intent to the target scene may wish to turn on all the lights in some space in the target scene, and the intelligent panel displays the lighting intent as a "lights fully on" intent card. In other cases, a user adding a lighting intent to the target scene may wish to turn on one or several specific lights in a space, and the intelligent panel may display intent cards for particular lights in the full-house space, for example specific intent cards for a wall light strip, a ceiling light strip, and the like.
For example, as in (b) of fig. 7, considering that a user eating in the restaurant may want a dining atmosphere, the intelligent panel displays the lighting intent of the restaurant in the form of a "lights fully on" intent card. The user may add the "lights fully on" intent to the dining scene. Then, when the trigger condition of the dining scene is met, the intelligent panel can control all the lights in the restaurant to turn on, meeting the user's lighting needs in the dining scene.
As another example, as also shown in fig. 7 (b), on the card 402a corresponding to the living room, the intelligent panel displays the "ceiling light strip" intent card. The intelligent panel may also display the intent cards of other lights in the living room. The user may add the intent cards of multiple lights in the living room to the target scene. Subsequently, when the trigger condition of the target scene is met, the intelligent panel can control the multiple lights in the living room to turn on.
In some examples, upon detecting a user clicking on control 402b as shown in fig. 7 (b), the electronic device can jump to interface 403 as shown in fig. 7 (c).
Interface 403 includes one or more scenes corresponding to the living room, such as an entering-living-room scene, a leaving-living-room scene, a reading scene, and a party mode scene. In some examples, interface 403 also includes a scene creation control 403a. Upon detecting the user's operation of clicking control 403a, the electronic device may provide an interface through which the user can input information such as the name and trigger condition of the scene to be created. After detecting that the user has input the name, trigger condition, and the like of the scene to be created, the electronic device may jump to interface 404 as shown in fig. 7 (d), through which the user can edit the scene to be created. Illustratively, as indicated by the mark 404i in interface 404, the trigger condition of the viewing mode scene to be created is: the user clicks the card.
Considering that in some scenes the user may not clearly know the optimal devices required by the scene to be created, in the embodiment of the application the scene editing function can be provided to the user at intent granularity: the user does not need to select the devices to be controlled in the scene one by one, but can instead select an intent, and the system accurately computes the one or more optimal devices for fulfilling that intent.
Interface 404 includes a region 404a and a region 404b. Region 404b may be referred to as the scene editing region. Region 404a may be referred to as the intent display region and includes identifiers of one or more intents corresponding to the plurality of subsystems. Each intent may be implemented by one or more devices. Optionally, the identifier of an intent may be presented in the form of a card, which may be referred to as an intent card and is used to represent the intent. Optionally, the intelligent panel may query the configuration file of each subsystem and determine, from the configuration file, the atomic capabilities of each subsystem and the intents corresponding to each atomic capability. For example, the intelligent panel may query the configuration file of the lighting subsystem shown in Table 1, determine from it the atomic capabilities of the lighting subsystem and their corresponding intents, and display the intents corresponding to the lighting subsystem.
For example, as in fig. 7 (d), the following intent cards are presented in region 404a of interface 404: lights fully on, lights fully off, and combination light on (brightness parameter 60%) corresponding to the lighting subsystem; constant temperature, constant humidity, and purification off corresponding to the cooling and heating fresh air subsystem; and curtains fully open and curtains fully closed corresponding to the sunshade subsystem.
In response to the user's selection of an intent card presented in region 404a, the electronic device may add the selected intent card to the scene to be created. For example, as in (d) of fig. 7, in response to the user dragging the temperature adjustment card 404d from region 404a to region 404b, the electronic device adds the temperature adjustment card, i.e., the temperature adjustment intent, to the scene to be created. Alternatively, instead of dragging the temperature adjustment card 404d from region 404a to region 404b, the operation may be a click on the temperature adjustment card 404d; the present application is not limited in this regard. For another example, in response to the user dragging the combination light (brightness 60%) card and the curtains-fully-closed card from region 404a to region 404b, the electronic device adds the combination light on (brightness 60%) intent and the curtains-fully-closed intent to the scene to be created. As shown in fig. 7 (d), the created viewing mode scene includes the combination light on intent (brightness parameter 60%), the temperature adjustment intent (constant temperature of 24 degrees), and the curtains-fully-closed intent.
Similarly, the user may drag other intention cards, and the electronic device may add corresponding intents to the scene to be created according to the drag operation of the user.
Optionally, the interface 404 also includes controls for displaying the hidden intent cards of the corresponding subsystem. For example, as in fig. 7 (d), interface 404 includes a control 404f; in response to a user operation (e.g., a click) on control 404f, the electronic device may display interface 405 as shown in fig. 7 (e), where region 404a of interface 405 includes other intent cards of the lighting subsystem, such as the combination light on (color temperature 5500 kelvin (K)) intent card. In some examples, the user may drag the combination light on (color temperature 5500 K) intent card from region 404a to region 404b, adding the combination light on (color temperature 5500 K) intent to the viewing mode scene to be created.
Alternatively, the first electronic device may display the hidden intent cards of the corresponding subsystem based on other user operations. For example, as shown in fig. 8 (a), when the first electronic device detects a left-swipe operation near the tag 404g of the lighting subsystem in interface 404, the electronic device may display the hidden reading light off intent card 404h and wall light strip off intent card 404i of the lighting subsystem, as shown in fig. 8 (b). The user may drag the reading light off intent card 404h and the wall light strip off intent card 404i from region 404a to region 404b, adding the reading light off intent and the wall light strip off intent to the scene to be created. In this scheme, the user can create the target scene through simple selection on the current interface (e.g., dragging an intent card to the scene editing region 404b), avoiding the problem in the scheme shown in fig. 1 that the user must repeatedly switch among multiple interfaces to edit a scene, which makes scene editing inefficient.
When creating a scene, the user can combine multiple intents to generate a scene that meets the device control requirements, for example combining the intents of turning on a warm light, the curtains, and the fresh air device to generate a "comfortable environment" scene.
In some embodiments, during the creation or editing of a scene, the first electronic device may receive the user's modification of an intent and modify the intent. Illustratively, as in (a) of fig. 9, in response to the user dragging intent cards from the intent display region 404a (such as the reading light off intent card 404h and the combination light on with brightness adjustment intent card 404k) to the scene editing region 404b, the intelligent panel adds these intents to the viewing mode scene. As in (b) of fig. 9, upon detecting the user long-pressing the intent card 404k, the electronic device may display a brightness adjustment bar 404m on the intent card 404k. The user may drag the brightness adjustment bar 404m to adjust the brightness of the combination light (e.g., to 30%). Similarly, the electronic device can receive the user's operations on other intent cards and adjust the control parameters of the electronic devices that can implement the corresponding intents.
Fig. 10 illustrates the relationship between a scene and devices or subsystems. As can be seen from fig. 10 (a), a scene created by a user is typically associated with multiple devices (e.g., 7 devices per scene on average). Therefore, in a device-granularity scene editing scheme, the user has to switch to the interfaces of multiple devices (such as the air conditioner interface shown in fig. 1 and the interfaces of other devices) and add each device to the scene to be created. For example, if the user needs to adjust the brightness of several lights in the living room when editing a scene, the related-art scene editing method requires the user to edit the lights one by one. In contrast, in the embodiment of the present application, as shown in fig. 10 (b), the devices of the whole house are divided into a plurality of subsystems, with on average 1.2 subsystems (or some other number of subsystems) associated with each scene; each subsystem includes one or more devices, and scene editing is performed at the granularity of subsystem intents. Since an intent is typically executed by one or more devices, in some cases one scene edit performed by the user at intent granularity is equivalent to multiple scene edits at device granularity. In this way, unified control of devices across multiple spaces can be achieved without orchestrating each device separately through multiple interfaces. For example, when editing a scene, the user only needs to drag the identifier of the lighting subsystem's combination light intent (brightness parameter 60%) to the scene editing region on the current interface to complete the scene editing. This simplifies the user's orchestration operations and can improve orchestration efficiency.
Alternatively, as shown in fig. 7 (d), after detecting the user's operation of clicking the "save" button 404e, the first electronic device may save the viewing mode scene created this time. As in (a) of fig. 11, the electronic device displays interface 501, which includes a card 501a of the viewing mode scene created by the above process. Optionally, card 501a may carry a control 501b for modifying the viewing mode scene.
In some examples, after the first electronic device creates a scene, the scene may be executed when its trigger condition is satisfied. For example, as shown in fig. 11 (a), if the user's operation of clicking card 501a is detected, the electronic device may execute each intent corresponding to the created viewing mode scene: the electronic device instructs one or more lights capable of achieving the 60% brightness parameter to turn on; instructs the reading light, the wall light strip, and the ceiling light strip to turn off; instructs all curtains in the living room to close; and instructs a device capable of fulfilling the temperature adjustment intent (such as air conditioner A in the living room) to adjust the temperature to 24 degrees.
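Executing a scene by expanding its intents into device commands might be sketched as follows; the intent payloads mirror the viewing-mode example above, while the resolver and command-sending callables are illustrative stubs rather than the panel's actual dispatch path.

```python
def execute_scene(intents: list[dict], resolve_devices, send_command) -> None:
    """Expand each intent of a scene into concrete device commands.

    resolve_devices(capability) -> list of device ids able to fulfil it
    send_command(dev_id, command, params) -> delivers one command
    """
    for intent in intents:
        for dev_id in resolve_devices(intent["capability"]):
            send_command(dev_id, intent["command"], intent["params"])

viewing_mode = [
    {"capability": "dimming",     "command": "set_brightness", "params": {"percent": 60}},
    {"capability": "temperature", "command": "set_target",     "params": {"celsius": 24}},
    {"capability": "curtain",     "command": "close",          "params": {}},
]
execute_scene(viewing_mode,
              resolve_devices=lambda cap: [f"{cap}_device"],   # stub resolver
              send_command=lambda d, c, p: print(d, c, p))
```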
In some examples, the first electronic device may also modify a scene after it has been created. Illustratively, as in fig. 11 (a), upon detecting the user clicking control 501b on card 501a, the electronic device may display a pop-up window 502 as shown in fig. 11 (b). Pop-up 502 includes the intent cards contained in the viewing mode scene. For example, upon detecting a long press on the combination light on intent card 502a, the electronic device may display a brightness adjustment bar 502b on card 502a. The user may drag the brightness adjustment bar 502b to adjust the brightness of the combination light (e.g., to 30%). Similarly, the electronic device can receive the user's operations on other intent cards and adjust the control parameters of the electronic devices that can implement the corresponding intents.
In some embodiments, the user may also add or delete intents for a created scene. Illustratively, as shown in FIG. 11 (b), pop-up 502 also includes an intent display area 502d for the subsystems associated with the living room. The user can perform a sliding operation in the intent display area 502d to display the intent cards of the respective subsystems. As shown in fig. 11 (b), the intent display area 502d displays the intent cards of the lighting subsystem (combination light, reading light, wall light strip on). As shown in fig. 11 (c), in response to the user swiping up in the intent display area 502d, the intelligent panel displays the intent cards of the cooling and heating fresh air subsystem (temperature, humidity, purification) in the intent display area 502d.
It should be understood that, as in fig. 11 (c), after detecting the user's operation of clicking the "save" button 502c, the electronic device saves this round of editing of the viewing mode scene.
The embodiment of the application also provides other implementation schemes of the scene editing interface. Alternatively, as in (a) of fig. 12A, the scene editing interface 404 may include a spatial identification column 1104. The space identification column 1104 includes an identification of a plurality of spaces of the whole house. Such as spatial identification of a whole house, living room, restaurant, etc.
In some examples, the spatial identification bar 1104 may be displayed when the intelligent panel opens the scene editing interface 404. Alternatively, in other examples, after the intelligent panel displays the scene editing interface 404, the spatial identification bar 1104 is displayed in response to a specific operation by the user. For example, after the intelligent panel displays the scene editing interface 404, the intelligent panel displays the space identification bar 1104 on the scene editing interface 404 in response to a user's right-slide operation starting at the left edge of the screen.
In some embodiments, after editing a scene of the current space (such as the living room scene shown in fig. 12A (a)), the first electronic device may switch to the scene editing interface of a target space in response to the user's operation of switching to the target space. Illustratively, as shown in fig. 12A (a), the intelligent panel displays the scene editing interface 404 of the living room. Then, as shown in fig. 12A (b), after detecting the user clicking the restaurant space tag 1102, the intelligent panel displays the scene editing interface 1101 of the restaurant. It can be seen that the user can switch the intelligent panel between spaces by operating the space tags provided in the current interface, and can thus simply and quickly enter the scene editing interface of another space, making scene editing more efficient.
Optionally, as shown in fig. 12A (a), the space identification column 1104 may also include a control 1103 for adding a space tag (e.g., a tag for a study, a bathroom, etc.).
Fig. 12A mainly illustrates an interface presenting the intent identifiers corresponding to one space; in other embodiments, the intent identifiers corresponding to multiple spaces may be presented on one interface. For example, as shown in fig. 12B, the intelligent panel displays interface 1203, which includes multiple space tags (such as tag 1203a of the restaurant and tag 1203b of the living room). In response to the user selecting tag 1203a and tag 1203b, the intelligent panel displays the intent cards corresponding to the restaurant and the intent cards corresponding to the living room in the intent display region 404a.
Alternatively, the UI effect of the restaurant's intent cards may differ from that of the living room's intent cards, so that the user can distinguish the intent cards of different spaces.
Illustratively, the UI effect of the restaurant's intent cards is similar to that of the restaurant's space tag 1203a (for example, the background color of the restaurant's space tag 1203a is red, and the background color of the restaurant's intent cards is also red), and the UI effect of the living room's intent cards is similar to that of the living room's space tag 1203b.
Still further exemplary, as in FIG. 12C (a), the intelligent panel displays interface 1201, which includes a space identification column 1204. The UI effect of each space identifier in column 1204 may differ from that in the space identification column 1104 shown in fig. 12A (a); for example, in fig. 12A (a) the space identifiers of the living room and other spaces are circular, whereas in fig. 12C (a) the space identifier of the living room is rectangular. In some examples, the space identifiers may be displayed together when interface 1201 is displayed. Alternatively, after interface 1201 is displayed, the intelligent panel may display the space identification column 1204 on interface 1201 in response to a specific user operation (such as a right-swipe from the left edge of the screen).
The space identification column includes a control 1201a. In response to the user clicking control 1201a, the intelligent panel may display a pop-up window 1201b. After detecting the user clicking the "restaurant" option, as in (b) of fig. 12C, the intelligent panel may display the restaurant's space tag 1201c in interface 1201. In some examples, when the restaurant identifier 1201c and the living room identifier are selected simultaneously, the intelligent panel may display the restaurant's intent cards (e.g., card 1201d) and the living room's intent cards in interface 1201.
Optionally, in the scene editing interface, the UI effects of the intent identifiers corresponding to different spaces may differ, so that the user can identify the space to which each intent belongs. For example, in interface 1201 shown in fig. 12C (b), the borders of the living room's intent cards are displayed normally, while the borders of the restaurant's intent cards are displayed in bold.
In some examples, after creating the viewing mode scene shown in fig. 12C (b), the user may adjust parameters of the viewing mode scene. Illustratively, upon receiving a user operation for adjusting a scene parameter, the intelligent panel may pop up a pop-up window 1202 as shown in fig. 12C (C), and the user may adjust the scene parameter (e.g., the brightness of the combination lamp) through the pop-up window 1202.
It should be appreciated that after the first electronic device creates the target scene, the first electronic device detects whether the trigger condition for executing the target scene is satisfied, and executes the target scene when it is. For example, taking the trigger condition of the viewing mode scene to be a click on its corresponding card: when the intelligent panel detects the user clicking card 501a as shown in fig. 11 (a), the intelligent panel may instruct the devices of the one or more spaces associated with the viewing mode scene to perform the corresponding operations.
Specifically, the intelligent panel acquires the intents corresponding to the viewing mode scene. As shown in fig. 7 (d), when the user created the viewing mode scene, the intelligent panel determined that the intents corresponding to the viewing mode scene are: turn on the living room combination light and adjust its brightness to 60%, turn off the living room reading light, turn off the living room wall light strip, turn off the living room ceiling light strip, adjust the living room temperature to 24 degrees (the target temperature), and close all living room curtains. The intelligent panel may then determine, from the one or more target intents corresponding to the viewing mode scene, the target devices that can fulfill those target intents, and instruct the target devices to perform the corresponding operations.
As one possible implementation, the intelligent panel may obtain the parameters of the multiple lights in the living room and, based on these parameters, determine one or more target lights that can achieve 60% brightness. Optionally, the target lights may be the one or more lights that achieve 60% brightness with the best energy savings, or the one or more lights that achieve 60% brightness with the color temperature best suited to watching a movie, or lights of another type. Illustratively, as shown in fig. 13, the intelligent panel determines that the target lights to be controlled for the viewing mode scene are lights 1-3.
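One way to realize "the one or more lights that achieve 60% brightness with the best energy savings" is a brute-force search over lamp combinations, as in the sketch below; the lamp parameters (max_brightness as a share of the room's light output, power_w as energy cost) are assumptions for illustration, not the panel's actual selection algorithm.

```python
from itertools import combinations

def pick_target_lamps(lamps: list[dict], target_brightness: int) -> list[dict]:
    """Choose the lamp combination reaching the target brightness
    with the lowest total power draw (exhaustive search over subsets)."""
    best, best_power = None, float("inf")
    for r in range(1, len(lamps) + 1):
        for combo in combinations(lamps, r):
            if sum(l["max_brightness"] for l in combo) >= target_brightness:
                power = sum(l["power_w"] for l in combo)
                if power < best_power:
                    best, best_power = combo, power
    return list(best) if best else []

lamps = [
    {"id": "lamp_1", "max_brightness": 30, "power_w": 5},
    {"id": "lamp_2", "max_brightness": 25, "power_w": 4},
    {"id": "lamp_3", "max_brightness": 20, "power_w": 3},
    {"id": "lamp_4", "max_brightness": 40, "power_w": 12},
]
# Selects lamps 1-3 (total 75%, 12 W) over combinations using lamp 4.
targets = pick_target_lamps(lamps, 60)
```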
Still further exemplary, the user may also combine multiple intents to create new scenes such as a birthday party scene, a warm home scene, a whole-house dehumidification scene, or a whole-house ventilation scene.
For example, the intelligent panel may acquire the parameters of the devices in the living room that can implement temperature control (which may be referred to simply as temperature control devices) and determine, among the multiple temperature control devices, the target temperature control device for adjusting the living room temperature to the target temperature. Optionally, the target temperature control device may be an electric heater, an air conditioner, or another temperature control device.
Similarly, the intelligent panel may determine other target devices for fulfilling target intents. The foregoing only lists exemplary methods of determining a target device; it should be appreciated that the first electronic device (e.g., the intelligent panel) may also determine the target device capable of fulfilling the target intent in other ways.
Fig. 14 shows another example of a scene editing interface provided by an embodiment of the present application. Optionally, the user may switch to the scene editing interface of the corresponding space through a space tag. As shown in fig. 14 (a), the intelligent panel displays a scene editing interface 1401 of the whole house. The scene editing interface 1401 includes a tag 1401a for each scene (which may be referred to simply as a scene tag). As in fig. 14 (a), scene tags include, but are not limited to, tags for an away-from-home scene, a return-home scene, a sleep scene, a reading scene, and the like. The user may select a scene's tag to edit that scene. Optionally, when a scene tag is selected, the intelligent panel may display an intent near the center of the display screen; this intent may be referred to as the center intent. For example, in interface 1401 the tag of the away-from-home scene is selected, the intelligent panel displays the center intent (away from home) near the center of the display screen, and the user can edit the away-from-home scene through interface 1401.
Optionally, the interface 1401 also includes a control 1401b, the control 1401b being used to add a scene tag.
In some embodiments, interface 1401 also includes an identifier of each subsystem of the whole house, such as the identifier 1401c of the security subsystem, as well as identifiers of the audio-video subsystem, the lighting subsystem, and the cooling and heating fresh air subsystem. The user may select the identifier of a target subsystem from among the subsystem identifiers, and the intelligent panel may add the intents achievable by the target subsystem to the away-from-home scene. Illustratively, as in fig. 14 (a), the user drags the security subsystem identifier 1401c toward the center intent (away from home); as in fig. 14 (b), after detecting that the identifier 1401c collides with the center intent (referred to as a collision operation) or comes close to it (e.g., the distance is less than a preset threshold), the intelligent panel determines that the user wants to add the security subsystem's intents to the away-from-home scene.
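The collision/proximity judgment described above can be sketched as a simple distance test; the pixel threshold below is an arbitrary assumption, since the embodiment only states that the distance is compared against a preset threshold.

```python
import math

PROXIMITY_THRESHOLD_PX = 40  # preset threshold; value assumed for illustration

def collides(center_xy: tuple[float, float],
             dragged_xy: tuple[float, float],
             threshold: float = PROXIMITY_THRESHOLD_PX) -> bool:
    """Treat a dragged subsystem identifier as 'colliding' with the center
    intent when their on-screen distance falls below a preset threshold."""
    dx = center_xy[0] - dragged_xy[0]
    dy = center_xy[1] - dragged_xy[1]
    return math.hypot(dx, dy) < threshold
```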
In some examples, the intelligent panel adds the security subsystem's default achievable intents to the away-from-home scene.
In other examples, the intelligent panel may provide an interface through which the user can adjust the intents achievable by the security subsystem. Illustratively, after detecting the collision of identifier 1401c with the center intent (away from home) as shown in fig. 14 (a), the intelligent panel displays the pop-up window 1401d shown in fig. 14 (c). Pop-up 1401d includes one or more intent cards that the security subsystem can implement (e.g., a smart door lock closed card and an infrared curtain fully open card).
In one implementation, a user may add corresponding intents to a target scene by selecting one or more cards of intents.
In another implementation, the one or more intent cards are active by default; that is, the intents are added to the target scene by default unless the user removes them. Alternatively, the user may delete some of the intents, or deactivate certain intents in the away-from-home scene through specific operations.
For example, as shown in fig. 14 (c), if the intelligent panel detects the user long-pressing (or double-clicking, or performing another operation on) intent 1401e, intent 1401e is deactivated; that is, intent 1401e is not added to the away-from-home scene, but its card may remain in pop-up 1401d. Optionally, the UI effect of a deactivated intent differs from that of an activated intent; for example, as in fig. 14 (c), the deactivated intent 1401e is displayed with a bold UI effect, or a deactivated intent is displayed in gray (not shown in the figure).
As a further example, as shown in fig. 14 (d), after the intelligent panel detects the user long-pressing (or double-clicking, or performing another operation on) intent 1401f, the intelligent panel displays a prompt box 1401g asking the user whether to remove the selected intent. Upon detecting the user clicking the "yes" option, the intelligent panel removes intent 1401f from pop-up 1401d, i.e., no longer displays intent 1401f in pop-up 1401d.
In some examples, as shown in fig. 14 (d), pop-up 1401d further includes a control 1401h for adding an intent to the target scene (such as the away-from-home scene). Similarly to the preceding embodiments, intent parameters can also be adjusted in the scheme corresponding to fig. 14; for the specific ways of adjusting intent parameters, reference may be made to the related descriptions of the preceding embodiments, which are not repeated here.
In other embodiments, multiple subsystems may be added to the away-from-home scene. For example, as shown in fig. 14 (b), the user may drag the lighting subsystem identifier 1401i and the cooling and heating fresh air subsystem identifier 1401j to collide with or approach the away-from-home scene identifier 1401k, whereupon the intelligent panel determines that the user wants to add the intents corresponding to the lighting subsystem and the cooling and heating fresh air subsystem to the away-from-home scene.
In some embodiments, the intelligent panel may recommend to the user, for each scene, the intents to be added to that scene. Taking the creation of a viewing mode scene as an example, the intelligent panel may determine the common intents contained in viewing mode scenes based on big data analysis or other methods, and display the cards of those common intents in region 404a of the scene editing interface 406 shown in fig. 15. For example, the cards of the common intents of a viewing mode scene include: the combination light (brightness 60%) intent card, the temperature intent card, and the curtains-fully-closed intent card. In this way, by recommending the cards of the intents commonly used in the corresponding scene, the user can be effectively guided to add those common intents to the scene, so that the scene can be created efficiently.
Optionally, the scene editing interface 406 may further include a control 404f; in response to the user clicking control 404f, the intelligent panel may display the other intent cards corresponding to the lighting subsystem. The user may select some of these intent cards and add them to the scene to be created.
The embodiment of the application also provides a smart device control method, in which the first electronic device may determine, from user information and the user's intent, a group of second electronic devices that the user desires to control (which may be referred to as the target electronic device group), and may display the identifier of the target electronic device group together with the identifiers of the electronic devices in that group that can fulfill the target intent (such as a lighting intent). For example, the target electronic device group may be the living room electronic device group, the home electronic device group, or the like; the identifier of the target electronic device group may be "living room", "home", and so on.
Illustratively, assuming the user is at home, as in fig. 7 (a), the intelligent panel displays interface 401, which includes a full house control option 401b. Upon detecting the user clicking the full house control option 401b, the intelligent panel may jump to interface 1602 shown in fig. 16A (a). As in (a) of fig. 16A, an intent identifier column 161 is displayed in interface 1602 for displaying the identifiers of intents. When the lamp-on intent identifier 162 in the intent identifier column is displayed in the selected state, the interface 1602 currently displayed by the intelligent panel is the control interface of the lamp-on intent corresponding to identifier 162. Interface 1602 includes the identifiers of the electronic devices in the primary bedroom that can fulfill the lamp-on intent (e.g., the identifiers of the dome light, spotlight, desk lamp, and floor lamp displayed in interface 1602 as shown in fig. 16A (a)). The user can operate the identifier of an electronic device in the primary bedroom to control the corresponding electronic device. Illustratively, the user moves the spotlight's identifier to collide with the identifier of the center intent (turn on light), and in response to this operation, the intelligent panel controls the spotlight to turn on. Still further exemplary, the user moves both the spotlight's identifier and the floor lamp's identifier to the center intent, and in response to the user's operation, the intelligent panel controls the spotlight and the floor lamp to turn on.
As yet another example, the user drags the identifier of the primary bedroom into collision with the identifier of the center intent (turn on light); in response, the intelligent panel determines the target electronic devices in the primary bedroom that the user wants to control, and controls them to execute the turn-on-light intent. For example, in response to this operation, the intelligent panel may default to controlling all devices in the primary bedroom that can realize the turn-on-light intent (e.g., the dome light, spotlight, desk lamp, and floor lamp) to execute it. For another example, in response to the user dragging the primary-bedroom identifier into collision with the center-intent identifier, the intelligent panel determines from the user information that the current time is 8:00 (typically the time the user gets home from work), and accordingly determines that the user wants to turn on the dome light in the primary bedroom.
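The time-based default described above could be sketched as follows; the time window and the choice of the dome light are assumptions made for illustration, not behavior fixed by the application.

```python
from datetime import time

def default_targets(now: time, candidates: list[str]) -> list[str]:
    """Devices to actuate when the user drops a whole group onto the
    center intent: in the habitual 'home from work' window, default to
    the dome light; otherwise control every capable device."""
    if time(19, 0) <= now <= time(21, 0):
        preferred = [d for d in candidates if d == "dome_light"]
        return preferred or candidates
    return candidates

print(default_targets(time(20, 0),
                      ["dome_light", "spotlight", "desk_lamp",
                       "floor_lamp"]))
# ['dome_light']
```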
Optionally, the interface 1602 may also include the identifiers of other electronic device groups that can realize the turn-on-light intent (e.g., the identifiers of the living room, secondary bedroom, and study). Optionally, assume the distance between the identifier of another such group and the center intent is a first distance, and the distance between the identifier of the group the user wants to control (which may be called the target electronic device group) and the center intent is a second distance; then the first distance is greater than the second distance. That is, the identifier of the target electronic device group is displayed closer to the center intent, so that the user can more easily drag the identifier of a target electronic device in that group into collision with the center intent, which makes controlling the target device convenient. For example, as shown in fig. 16A, the primary-bedroom identifier is closer to the center intent (turn-on-light intent) than the identifier of the secondary-bedroom group and the like, which shortens the drag distance of the collision operation, reduces the time needed to control a device, and improves control efficiency.
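A minimal sketch of this distance rule: the identifier of the target group is assigned the smaller radius (the second distance) from the center intent, and the other groups the larger one (the first distance). The pixel radii are arbitrary illustrative values.

```python
def layout_groups(target_group: str, all_groups: list[str],
                  second_distance: float = 120.0,
                  first_distance: float = 220.0) -> dict[str, float]:
    """Assign each group identifier a radius (px) from the center intent;
    the target group gets the smaller radius so it is quicker to drag."""
    return {g: second_distance if g == target_group else first_distance
            for g in all_groups}

print(layout_groups("primary_bedroom",
                    ["primary_bedroom", "living_room",
                     "secondary_bedroom", "study"]))
# {'primary_bedroom': 120.0, 'living_room': 220.0,
#  'secondary_bedroom': 220.0, 'study': 220.0}
```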
For another example, as shown in (b) of fig. 16A, when the user is in the living room and opens the control interface corresponding to the turn-on-light intent, the intelligent panel may determine from the user's location that the user wants to control the electronic devices in the living room, and then displays the interface 1603, which includes the identifiers of the electronic devices in the living room that can realize the turn-on-light intent, such as the identifiers of the dome light, spotlight, desk lamp, and floor lamp in the interface 1603 shown in (b) of fig. 16A.
For example, when the current time falls in the user's usual midday-break period and the user opens the control interface corresponding to the curtain-closing intent, the intelligent panel can determine from the current time that the user wants to control the electronic devices in the primary bedroom, and then displays on the control interface the identifiers of the curtains in the primary bedroom that can realize the curtain-closing intent; the user can conveniently control those curtains by operating their identifiers. Optionally, the intelligent panel may also display on the control interface the identifiers of other device groups that can realize the curtain-closing intent.
In some embodiments, the user may switch intents. For example, as shown in (b) of fig. 16A, the intelligent panel displays the interface 1603 corresponding to the turn-on-light intent 162. Subsequently, as shown in fig. 16B, after detecting that the user clicks the identifier of the music-playing intent in the intent identifier column 161, the intelligent panel displays the interface 1607 corresponding to the music-playing intent.
In some cases, besides the target electronic device group that the intelligent panel determines from the user information, the user may also want to control devices in other electronic device groups. As one possible implementation, in response to the user's operation on the identifier of an electronic device group, the intelligent panel may display the identifiers of the electronic devices in that group, through which the user can control them. For example, as shown in (a) of fig. 16A, if the user wants to control the dome light or another device in the primary bedroom, the user may drag the identifier of the corresponding electronic device (such as the identifier of the dome light or the spotlight) into collision with the center intent (turn on light), or drag it to a position whose distance from the center intent is less than a certain threshold, so as to trigger the intelligent panel to control the corresponding device to execute the turn-on-light intent.
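The collision-or-threshold trigger could be implemented along the following lines; the icon radius and slack threshold are illustrative assumptions.

```python
import math

def should_trigger(drag_pos: tuple[float, float],
                   center_pos: tuple[float, float],
                   icon_radius: float = 40.0,
                   slack: float = 25.0) -> bool:
    """Fire when the dragged identifier overlaps the center intent, or
    comes within `slack` pixels of overlapping (the threshold case)."""
    return math.dist(drag_pos, center_pos) <= 2 * icon_radius + slack

# The spotlight icon dragged to (100, 90) with the center intent at
# (120, 120): distance is about 36 px, well inside the trigger zone.
print(should_trigger((100.0, 90.0), (120.0, 120.0)))  # True
```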
If the user further wants to control an electronic device in the secondary bedroom, the user can long-press the secondary-bedroom identifier 169. In response, the intelligent panel may display the popup window 170 shown in fig. 17, which includes the identifier of each electronic device in the secondary bedroom that can realize the turn-on-light intent (such as the identifiers of the desk lamp and the dome light). In response to the user's operation on the identifier of a secondary-bedroom electronic device, the intelligent panel controls the corresponding device to execute the turn-on-light intent. For example, in response to the user clicking the identifiers of the desk lamp and the dome light in the secondary bedroom, the intelligent panel controls them to turn on.
The above description mainly takes as an example the case where the intelligent panel determines the target electronic device group and then displays, on the device control interface, the identifier of the target electronic device group together with the identifiers of the electronic devices in it. In some embodiments, after determining the target electronic device group, the intelligent panel may display the identifier of the target electronic device group without displaying the identifier of each electronic device in the group.
For example, as shown in (a) of fig. 18, when the user is in the primary bedroom and opens the control interface corresponding to the turn-on-light intent, the intelligent panel may determine from the user's location that the user wants to control the electronic devices in the primary bedroom, and displays the interface 1605, which includes the identifier 164 of the primary bedroom.
Optionally, the interface 1605 may further include the identifiers of other electronic device groups that can realize the turn-on-light intent, such as the identifiers of the living room, secondary bedroom, and study. Optionally, the distance between the primary-bedroom identifier and the center intent is smaller than the distance between the other group identifiers and the center intent, so that the user can conveniently drag the primary-bedroom identifier into collision with the center intent.
Optionally, in response to the user dragging the primary-bedroom identifier into collision with the center intent, the intelligent panel automatically determines the target electronic devices in the primary bedroom that the user wants to control and that can realize the turn-on-light intent. For example, the intelligent panel determines that the user wants to control all devices in the primary bedroom that can realize the turn-on-light intent. For another example, the intelligent panel determines, from the user information, that the device in the primary bedroom that the user wants to control is the spotlight.
Alternatively, in response to the user dragging the primary-bedroom identifier into collision with the center-intent identifier, the intelligent panel pops up a window listing the devices in the primary bedroom that can realize the turn-on-light intent. In response to the user selecting a target electronic device from that list, the intelligent panel determines the target electronic device the user wants to control. The intelligent panel may also employ other methods to determine the target electronic device.
As another example, as shown in (b) of fig. 18, when the user is in the living room and opens the control interface corresponding to the turn-on-light intent, the intelligent panel may determine from the user's location that the user wants to control the electronic devices in the living room, and then displays the interface 1606, which includes the identifier 172 of the living room. Optionally, the distance between the living-room identifier and the center intent is smaller than the distance between the other group identifiers and the center intent, so that the user can conveniently drag the living-room identifier into collision with the center intent.
In some embodiments, the intelligent panel may determine control parameters for a target electronic device in the group of electronic devices.
Illustratively, as shown in (a) of fig. 19, the intelligent panel displays the interface 401 including a full-house control option 401b. Upon detecting that the user clicks the full-house control option 401b, the intelligent panel may jump to the interface 1601 shown in (b) of fig. 19, on which the intent identifier column 161 is displayed; the column 161 is used to display the identifiers of intents.
As shown in (b) of fig. 19, the turn-on-light intent identifier 162 displayed in the intent identifier column 161 is highlighted, and the interface 1601 currently displayed on the intelligent panel is the control interface of the turn-on-light intent corresponding to the identifier 162. As indicated by reference numeral 163, the center intent currently displayed by the intelligent panel is the turn-on-light intent. For example, the user may click the identifier 162 of the turn-on-light intent, and in response, the intelligent panel may display the turn-on-light intent near the center of the screen (referred to as the center intent 163).
As one possible implementation, the intelligent panel may display, around the center intent (turn-on-light intent), the electronic device groups that can realize that intent. As indicated by reference numeral 164, the primary bedroom contains electronic devices that can realize the turn-on-light intent, so the intelligent panel displays the identifier of the primary-bedroom group (the circle indicated by reference numeral 164 on the interface 1601). The intelligent panel may also display on the interface 1601 other electronic device groups that can realize the turn-on-light intent (e.g., the groups of the living room and the secondary bedroom).
As a possible implementation, the intelligent panel may acquire user information that it is authorized to obtain, determine from that information the control parameters of the devices in the corresponding group, and prompt the user with those parameters. Optionally, the user information includes, but is not limited to, one or more of the following: the user's location, the time, and the user's behavior.
As another example, the intelligent panel learns, from historically obtained user information, that the user typically reads or studies in the study at 8:30-10:00 pm. Then, if the current time is 9:00 pm when the user opens the interface 1601, the intelligent panel may display on the interface 1601 recommended control parameters, suitable for a reading or studying scene, for the devices in each corresponding group. For example, as shown in (b) of fig. 19, the intelligent panel displays on the interface 1601 the study identifier indicated by reference numeral 165, a suggested reading brightness of 30% indicated by reference numeral 166, and a suggested reading color temperature of 5000K indicated by reference numeral 167.
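One possible way to derive such a suggestion from a learned schedule is sketched below; the schedule entry and the parameter values mirror the reading example above but are assumptions rather than values mandated by the application.

```python
from dataclasses import dataclass
from datetime import time

@dataclass
class Suggestion:
    group: str
    brightness_pct: int
    color_temp_k: int

# Learned from historically obtained user information: during this window
# the user typically reads or studies in the study.
SCHEDULE = [
    (time(20, 30), time(22, 0), Suggestion("study", 30, 5000)),
]

def suggest(now: time) -> Suggestion | None:
    """Return the suggested control parameters for the current time."""
    for start, end, suggestion in SCHEDULE:
        if start <= now <= end:
            return suggestion
    return None

print(suggest(time(21, 0)))
# Suggestion(group='study', brightness_pct=30, color_temp_k=5000)
```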
Alternatively, the user may adjust the suggested control parameters displayed by the intelligent panel. Illustratively, as shown in (c) of fig. 19, the user can long-press the suggested color-temperature card indicated by reference numeral 167 and adjust it to a new color temperature of 5500K. Thereafter, as shown in (d) of fig. 19, the user can drag the study identifier 165 toward the center intent (turn on light). After detecting that the study identifier 165 collides with the center intent, the intelligent panel may send a control instruction to the target devices in the study that can realize the turn-on-light intent (brightness 30%, color temperature 5500K), and the target devices execute the intent. There may be one or more target devices. Illustratively, the target devices are lamp 1 and lamp 2, whose combined brightness is 30% and combined color temperature is 5500K. Alternatively, the target device is lamp 3, whose brightness is the suggested 30% and whose color temperature is the suggested 5500K. In this scheme, the intelligent panel can recommend to the user, according to the user information, device control parameters that suit the current scene (such as a reading or studying scene) and realize the target intent; for example, it can recommend the brightness and color-temperature parameters of devices that suit a reading scene and can realize the turn-on-light intent. The intelligent panel can thus provide the user with personalized device-control guidance that effectively guides the user in controlling the smart devices.
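The dispatch step, with the user-adjusted color temperature, might look like the following sketch; the device names and the instruction format are hypothetical.

```python
def execute_intent(targets: list[str], brightness_pct: int,
                   color_temp_k: int) -> list[dict]:
    """Build one turn-on-light control instruction per target device."""
    return [{"device": t,
             "intent": "turn_on_light",
             "brightness_pct": brightness_pct,
             "color_temp_k": color_temp_k}
            for t in targets]

# After the user long-presses the color-temperature card and changes
# 5000 K to 5500 K, the study identifier collides with the center intent:
for instruction in execute_intent(["lamp_1", "lamp_2"], 30, 5500):
    print(instruction)
```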
As another example, as shown in (a) of fig. 20, the intelligent panel displays the interface 1604, which includes the electronic devices in the primary bedroom that can realize the turn-on-light intent (such as the spotlight and the dome light) and the suggested brightness (30%) and suggested color temperature (5500K) corresponding to that intent. As shown in (b) of fig. 20, in response to the user dragging the spotlight identifier toward the center intent, the intelligent panel controls the spotlight in the primary bedroom to turn on, with a brightness of 30% and a color temperature of 5500K.
Alternatively, the suggested control parameters corresponding to different electronic device groups may be presented with different UI effects. For example, as shown in fig. 21, the suggested-brightness card 166 and the suggested-color-temperature card 167 for the electronic devices in the study that can realize the turn-on-light intent (e.g., the floor lamp and the reading lamp) are presented with a first UI effect (e.g., an upper-right to lower-left slash), while the suggested-brightness card and suggested-color-temperature card for the electronic devices in the living room that can realize the intent (e.g., the floor lamp) are presented with a second UI effect (e.g., an upper-left to lower-right slash). For example, the user may drag the study identifier toward the center intent (turn on light); after detecting the collision, the intelligent panel can send a control instruction to the target devices in the study that can realize the turn-on-light intent (brightness 30%, color temperature 5000K), and those devices execute the intent. For another example, the user may drag the living-room identifier toward the center intent; after detecting the collision, the intelligent panel can send a control instruction to the target devices in the living room that can realize the intent (brightness 35%, color temperature 5500K), and those devices execute the intent.
In the above examples, the intelligent panel determines the recommended control parameters of the corresponding electronic devices according to the user information; in other embodiments, the intelligent panel may also determine them according to other information.
In some embodiments, the user may also select multiple intents, which the intelligent panel may combine. For example, as shown in fig. 22, upon detecting that the user selects the turn-on-light intent 162 and the music-playing intent from the intent identifier column 161, the intelligent panel displays the center intent (turn on light, play music) near the center of the screen. In response to the user dragging the primary-bedroom identifier into collision with the center intent, or dragging it to within a certain distance of the center intent, the intelligent panel can control a lamp in the primary bedroom to turn on and control a device such as a speaker in the primary bedroom to play music.
Optionally, the intelligent panel may further determine, according to the user information, the device control parameters corresponding to the turn-on-light intent and the music-playing intent. For example, if the current time is 10:00 pm, the intelligent panel determines that the suggested volume corresponding to the music-playing intent is xx decibels (dB) and displays a suggested-volume card as indicated by reference numeral 168. This effectively guides the user in controlling the devices involved in the music-playing intent, so that nearby neighbors are not disturbed during playback.
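A sketch of combining the two selected intents and applying a time-dependent volume suggestion is given below; the capability table, the 10:00 pm threshold, and the 40 dB value are illustrative assumptions.

```python
from datetime import time

# Which intents each device in the group can realize (illustrative).
CAPABILITIES = {
    "dome_light": {"turn_on_light"},
    "speaker": {"play_music"},
}

def execute_combined(group_devices: list[str], intents: set[str],
                     now: time) -> list[dict]:
    """Fan the combined center intent out to every capable device,
    attaching a lowered suggested volume late in the evening."""
    instructions = []
    for device in group_devices:
        for intent in intents & CAPABILITIES.get(device, set()):
            instruction = {"device": device, "intent": intent}
            if intent == "play_music" and now >= time(22, 0):
                instruction["suggested_volume_db"] = 40  # spare the neighbors
            instructions.append(instruction)
    return instructions

print(execute_combined(["dome_light", "speaker"],
                       {"turn_on_light", "play_music"}, time(22, 0)))
```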
The one or more interfaces described above are exemplary; other interface designs are also possible. The present application does not limit the specific design of the interfaces or the manner of switching between them.
The layout of the scene editing interface is likewise not limited; for example, the trigger condition of the scene may be displayed in the right-hand column of the scene editing interface, the intent cards of a subsystem in the lower area of the interface, and so on.
Likewise, the operations on the controls in the interfaces are only examples: the user may trigger the first electronic device to execute a target operation by performing a certain operation on a target control in the interface. Other operations may also be defined, which the first electronic device performs when it detects that the user applies them to the target control.
The above description mainly takes dividing groups by space as an example; in other embodiments, groups may be divided according to other criteria. For example, groups may be divided by the distance between a device and the user, with devices whose distance from the user is less than a first threshold divided into a first group. The intelligent panel may display the identifier of each device in the first group, so that the user can control the devices that are nearby.
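A distance-based grouping of this kind could be sketched as follows; the device positions and the first threshold are illustrative assumptions.

```python
import math

def group_by_distance(user_pos: tuple[float, float],
                      device_pos: dict[str, tuple[float, float]],
                      first_threshold: float = 3.0) -> list[str]:
    """Return the first group: devices within first_threshold meters of
    the user."""
    return [device for device, pos in device_pos.items()
            if math.dist(user_pos, pos) < first_threshold]

print(group_by_distance((0.0, 0.0),
                        {"desk_lamp": (1.0, 2.0),
                         "dome_light": (5.0, 4.0)}))
# ['desk_lamp']
```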
It should be noted that some operations in the flows of the above method embodiments are optionally combined, and/or the order of some operations is optionally changed. The execution order of the steps in each flow is merely exemplary and does not constitute a limitation; other execution orders may be used between the steps, and the order described is not meant to be the only one in which the operations may be performed. Those of ordinary skill in the art will recognize a variety of ways to reorder the operations described herein. In addition, the details of the processes described in one embodiment apply in a similar manner to the other embodiments, and different embodiments may be used in combination.
Moreover, some steps in method embodiments may be equivalently replaced with other possible steps. Alternatively, some steps in method embodiments may be optional and may be deleted in some usage scenarios. Alternatively, other possible steps may be added to the method embodiments.
Moreover, the method embodiments described above may be implemented alone or in combination.
For example, the intelligent panel may determine a group of electronic devices that the user wants to control based on the user information, and may determine suggested control parameters for a target electronic device in the group of electronic devices based on the user information.
It will be appreciated that, in order to implement the above-mentioned functions, the device in the embodiment of the present application includes corresponding hardware structures and/or software modules for performing the respective functions. The various illustrative units and algorithm steps described in connection with the embodiments disclosed herein may be implemented as hardware or a combination of hardware and computer software. Whether a function is implemented as hardware or computer software driven hardware depends upon the particular application and design constraints imposed on the solution. Those skilled in the art may implement the described functionality using different approaches for each particular application, but such implementation is not to be considered as beyond the scope of the embodiments of the present application.
The embodiment of the application can divide the functional units of the electronic device according to the method example, for example, each functional unit can be divided corresponding to each function, and two or more functions can be integrated in one processing unit. The integrated units may be implemented in hardware or in software functional units. It should be noted that, in the embodiment of the present application, the division of the units is schematic, which is merely a logic function division, and other division manners may be implemented in actual practice.
Fig. 23 shows a schematic block diagram of a first electronic device provided in an embodiment of the present application. The first electronic device 3300 may exist in the form of software, or may be a chip usable in the device. The first electronic device 3300 includes: a processing unit 3303, a transceiving unit 3302, and a display unit 3301. Optionally, the transceiving unit 3302 may be further divided into a sending unit (not shown in fig. 23) and a receiving unit (not shown in fig. 23). The sending unit is configured to support the first electronic device 3300 in sending information to other devices; the receiving unit is configured to support the first electronic device 3300 in receiving information from other devices. The display unit 3301 is configured to support the display of content.
Optionally, the first electronic device 3300 may further include a storage unit 1701 (not shown in the figure) for storing the program code and data of the first electronic device 3300; the data may include, but is not limited to, raw data, intermediate data, and the like.
In one possible approach, the processing unit 3303 may be a controller, or the processor 501 and/or the processor 507 shown in fig. 4, such as a central processing unit (Central Processing Unit, CPU), a general-purpose processor, a digital signal processor (Digital Signal Processing, DSP), an application-specific integrated circuit (Application Specific Integrated Circuit, ASIC), a field programmable gate array (Field-Programmable Gate Array, FPGA) or other programmable logic device, a transistor logic device, a hardware component, or any combination thereof, and may implement or execute the various exemplary logic blocks, modules, and circuits described in connection with this disclosure. The processor may also be a combination that performs computing functions, for example a combination including one or more microprocessors, or a combination of a DSP and a microprocessor.
In a possible manner, the transceiver unit 3302 may be the communication interface 504 shown in fig. 4, and may also be a transceiver circuit, a transceiver, a radio frequency device, or the like.
In one possible approach, the storage unit 1701 may be the memory 503 shown in fig. 4.
In one possible approach, the display unit 3301 may include a display screen.
The embodiment of the application further provides an electronic device, which includes one or more processors and one or more memories. The one or more memories are coupled to the one or more processors and are configured to store computer program code; the computer program code includes computer instructions that, when executed by the one or more processors, cause the electronic device to perform the relevant method steps described above to implement the intelligent device control method of the above embodiments.
The embodiment of the application also provides a chip system, which comprises: a processor coupled to a memory for storing programs or instructions which, when executed by the processor, cause the system-on-a-chip to implement the method of any of the method embodiments described above.
Alternatively, the processor in the system-on-chip may be one or more. The processor may be implemented in hardware or in software. When implemented in hardware, the processor may be a logic circuit, an integrated circuit, or the like. When implemented in software, the processor may be a general purpose processor, implemented by reading software code stored in a memory.
Alternatively, there may be one or more memories in the system-on-chip. The memory may be integrated with the processor or provided separately from the processor, which is not limited in the present application. The memory may be a non-transitory memory, such as a ROM, which may be integrated on the same chip as the processor or provided separately on a different chip; the type of the memory and the manner in which the memory and the processor are arranged are not specifically limited in the present application.
The system-on-chip may be, for example, a field programmable gate array (FPGA), an application-specific integrated circuit (ASIC), a system on chip (SoC), a central processing unit (CPU), a network processor (NP), a digital signal processor (DSP), a microcontroller unit (MCU), a programmable logic device (PLD), or another integrated chip.
It should be understood that the steps in the above-described method embodiments may be accomplished by integrated logic circuitry in hardware in a processor or instructions in the form of software. The steps of the method disclosed in connection with the embodiments of the present application may be embodied directly in a hardware processor for execution, or in a combination of hardware and software modules in the processor for execution.
The embodiment of the application also provides a computer-readable storage medium, in which computer instructions are stored; when the computer instructions run on the electronic device, the electronic device is caused to execute the above related method steps to implement the intelligent device control method in the above embodiments.
The embodiment of the present application also provides a computer program product which, when run on a computer, causes the computer to perform the above related steps to implement the intelligent device control method in the above embodiments.
In addition, embodiments of the present application also provide an apparatus, which may specifically be a component or a module, and which may include a connected processor and memory; the memory is configured to store computer-executable instructions, and when the apparatus runs, the processor may execute the computer-executable instructions stored in the memory, so that the apparatus performs the intelligent device control method in each of the above method embodiments.
The electronic device, the computer readable storage medium, the computer program product or the chip provided by the embodiments of the present application are used to execute the corresponding method provided above, so that the beneficial effects thereof can be referred to the beneficial effects in the corresponding method provided above, and will not be described herein.
It will be appreciated that, in order to achieve the above-described functionality, the electronic device comprises corresponding hardware and/or software modules that perform the respective functions. In conjunction with the example algorithm steps described for the embodiments disclosed herein, the present application can be implemented in hardware or in a combination of hardware and computer software. Whether a function is performed by hardware or by computer software driving hardware depends on the particular application and the design constraints of the technical solution. Those skilled in the art may, in conjunction with the embodiments, use different approaches to implement the described functionality for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
The present embodiment may divide the electronic device into functional modules according to the above method example; for example, each functional module may be divided corresponding to each function, or two or more functions may be integrated into one processing module. The integrated module may be implemented in the form of hardware. It should be noted that the division of modules in this embodiment is schematic and is merely a division by logical function; other division manners may be used in actual implementation.
From the description of the above embodiments, those skilled in the art will clearly understand that, for convenience and brevity of description, only the division into the above functional modules is used as an example for illustration; in practical applications, the above functions may be allocated to different functional modules as needed, that is, the internal structure of the apparatus may be divided into different functional modules to implement all or part of the functions described above. For the specific working processes of the systems, devices, and units described above, reference may be made to the corresponding processes in the foregoing method embodiments, which are not repeated here.
In the several embodiments provided in the present application, it should be understood that the disclosed method may be implemented in other manners. For example, the above-described embodiments of the terminal device are merely illustrative, e.g., the division of the modules or units is merely a logical functional division, and there may be additional divisions when actually implemented, e.g., multiple units or components may be combined or integrated into another system, or some features may be omitted or not performed. Alternatively, the coupling or direct coupling or communication connection shown or discussed with each other may be an indirect coupling or communication connection via interfaces, modules or units, which may be in electrical, mechanical or other forms.
The units described as separate components may or may not be physically separate, and the components shown as units may or may not be physical units; they may be located in one place or distributed over a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional unit in the embodiments of the present application may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit. The integrated units may be implemented in hardware or in software functional units.
If the integrated units are implemented in the form of software functional units and sold or used as stand-alone products, they may be stored in a computer-readable storage medium. Based on this understanding, the technical solution of the present application, in essence, or the part that contributes to the prior art, or all or part of the technical solution, may be embodied in the form of a software product. The software product is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, a network device, etc.) or a processor to perform all or part of the steps of the methods described in the embodiments of the present application. The aforementioned storage medium includes: a flash memory, a removable hard disk, a read-only memory, a random access memory, a magnetic disk, an optical disk, and the like.
The foregoing is merely illustrative of specific embodiments of the present application, and the scope of the present application is not limited thereto, but any changes or substitutions within the technical scope of the present application should be covered by the scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (21)

1. An intelligent device control method, which is applied to a first electronic device, comprises the following steps:
displaying a first interface, the first interface comprising an identification of one or more intents; at least one intent of the one or more intents corresponds to a plurality of electronic devices;
responsive to a user selecting a target intent from the one or more intents, adding the target intent to a target scene;
when the trigger condition of the target scene is met, controlling one or more target electronic devices to execute the target intention; the one or more target electronic devices are electronic devices corresponding to the target intention.
2. The method of claim 1, wherein the target electronic device comprises a second electronic device and a third electronic device, the controlling one or more target electronic devices to perform the target intent comprising:
sending a first control instruction to the second electronic device and sending a second control instruction to the third electronic device, so that the second electronic device and the third electronic device jointly execute the target intention.
3. The method of claim 1, wherein the target intent comprises a first target intent and a second target intent; the target electronic device comprises a second electronic device and a third electronic device;
the controlling one or more target electronic devices to execute the target intent includes:
sending a first control instruction to the second electronic device, wherein the first control instruction is used for controlling the second electronic device to execute the first target intention;
and sending a second control instruction to the third electronic device, wherein the second control instruction is used for controlling the third electronic device to execute the second target intention.
4. A method according to any one of claims 1-3, wherein the first interface further comprises an identification of a plurality of spaces of a full house; the plurality of spaces including a first space and a second space; the identification of the first space is selected; the one or more intents include an intent executable by an electronic device in the first space.
5. The method according to claim 4, wherein the method further comprises:
and receiving the operation of the user on the identification of the second space, and displaying a second interface, wherein the second interface comprises the identification of the intention executable by the electronic equipment in the second space.
6. The method of claim 4, wherein the identification of the second space is selected; the one or more intents are a plurality of intents, and the plurality of intents further comprise an intent executable by an electronic device in the second space.
7. The method of claim 6, wherein the identification of the intent executable by the electronic device in the first space has a different user interface UI effect than the identification of the intent executable by the electronic device in the second space.
8. The method according to any one of claims 1-7, wherein the control parameter to which the target intention corresponds is a first control parameter; the method further comprises the steps of:
receiving a second operation input by a user;
responding to the second operation, displaying a first control, wherein the first control is used for inputting a second control parameter corresponding to the target intention;
and receiving a second control parameter input by the user through the first control, so that the control parameter corresponding to the target intention is adjusted to the second control parameter.
9. The method of any one of claims 1-8, wherein the one or more intents belong to one or more subsystems.
10. An intelligent device control system, comprising:
a first electronic device for displaying a first interface, the first interface comprising an identification of one or more intents; at least one intent of the one or more intents corresponds to a plurality of electronic devices;
the first electronic device is further configured to add a target intent to a target scene in response to a user operation to select the target intent from the one or more intents;
the first electronic device is further configured to send a control instruction to one or more target electronic devices when a trigger condition of the target scene is satisfied; the one or more target electronic devices are electronic devices corresponding to the target intention;
the target electronic device is configured to receive the control instruction from the first electronic device, and execute the target intention according to the control instruction.
11. The system of claim 10, wherein the target electronic device comprises a second electronic device and a third electronic device;
the first electronic device being configured to send a control instruction to one or more target electronic devices includes: sending a first control instruction to the second electronic device and sending a second control instruction to the third electronic device;
the target electronic device, configured to receive the control instruction from the first electronic device, and execute the target intention according to the control instruction, includes: the second electronic device is used for receiving the first control instruction from the first electronic device and executing the target intention according to the first control instruction; the third electronic device is configured to receive the second control instruction from the first electronic device, and execute the target intention according to the second control instruction.
12. The system of claim 10, wherein the target intent comprises a first target intent and a second target intent; the target electronic device comprises a second electronic device and a third electronic device;
the first electronic device is configured to send a control instruction to one or more target electronic devices, and includes:
sending a first control instruction to the second electronic device, wherein the first control instruction is used for controlling the second electronic device to execute the first target intention;
sending a second control instruction to the third electronic device, wherein the second control instruction is used for controlling the third electronic device to execute the second target intention;
the target electronic device, configured to receive the control instruction from the first electronic device, and execute the target intention according to the control instruction, includes: the second electronic device is configured to receive the first control instruction from the first electronic device, and execute the first target intention according to the first control instruction; the third electronic device is configured to receive the second control instruction from the first electronic device, and execute the second target intention according to the second control instruction.
13. The system of any of claims 10-12, wherein the first interface further comprises an identification of a plurality of spaces of a full house; the plurality of spaces including a first space and a second space; the identification of the first space is selected; the one or more intents include an intent executable by an electronic device in the first space.
14. The system of claim 13, wherein the first electronic device is further configured to:
receiving the operation of the user on the identification of the second space, and displaying a second interface, wherein the second interface comprises the identification of the intention executable by the electronic equipment in the second space.
15. The system of claim 13, wherein the identification of the second space is selected; the one or more intents are a plurality of intents, and the plurality of intents further comprise an intent executable by an electronic device in the second space.
16. The system of claim 15, wherein the identification of the intent executable by the electronic device in the first space has a different user interface UI effect than the identification of the intent executable by the electronic device in the second space.
17. The system according to any one of claims 10-16, wherein the control parameter to which the target intention corresponds is a first control parameter;
the first electronic device is further configured to perform the following operations:
receiving a second operation input by a user;
responding to the second operation, displaying a first control, wherein the first control is used for inputting a second control parameter corresponding to the target intention;
and receiving a second control parameter input by the user through the first control, so that the control parameter corresponding to the target intention is adjusted to the second control parameter.
18. The system of any one of claims 10-17, wherein the one or more intents belong to one or more subsystems.
19. An electronic device, comprising: a processor, a memory and a display screen, the memory and the display screen being coupled to the processor, the memory being for storing computer program code, the computer program code comprising computer instructions which, when read from the memory by the processor, cause the electronic device to perform the method of any of claims 1-9.
20. A computer readable storage medium, characterized in that the computer readable storage medium comprises a computer program which, when run on an electronic device, causes the electronic device to perform the method according to any one of claims 1-9.
21. A computer program product, characterized in that the computer program product, when run on a computer, causes the computer to perform the method according to any of claims 1-9.
CN202210546887.9A 2022-05-19 2022-05-19 Intelligent device control method and electronic device Pending CN117130284A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202210546887.9A CN117130284A (en) 2022-05-19 2022-05-19 Intelligent device control method and electronic device
PCT/CN2023/094602 WO2023221995A1 (en) 2022-05-19 2023-05-16 Intelligent device control method and electronic device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210546887.9A CN117130284A (en) 2022-05-19 2022-05-19 Intelligent device control method and electronic device

Publications (1)

Publication Number Publication Date
CN117130284A true CN117130284A (en) 2023-11-28

Family

ID=88834670

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210546887.9A Pending CN117130284A (en) 2022-05-19 2022-05-19 Intelligent device control method and electronic device

Country Status (2)

Country Link
CN (1) CN117130284A (en)
WO (1) WO2023221995A1 (en)

Family Cites Families (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104460328B (en) * 2014-10-29 2019-05-10 小米科技有限责任公司 Smart machine control method and device based on set scene mode
CN105652671B (en) * 2015-12-25 2019-10-22 小米科技有限责任公司 The setting method and device of smart machine operating mode
US10719200B2 (en) * 2016-02-18 2020-07-21 Sure Universal Ltd. Architecture for remote control of IOT (internet of things) devices
CN106873551B (en) * 2016-11-30 2020-08-18 芜湖美智空调设备有限公司 Linkage method and system among different household appliances
WO2020228032A1 (en) * 2019-05-16 2020-11-19 深圳市欢太科技有限公司 Scene pushing method, apparatus and system, and electronic device and storage medium
CN111176517A (en) * 2019-12-31 2020-05-19 青岛海尔科技有限公司 Method and device for setting scene and mobile phone
CN111176133A (en) * 2020-02-11 2020-05-19 青岛海信智慧家居***股份有限公司 Method and device for determining smart home scene
CN112180754B (en) * 2020-10-20 2022-03-18 珠海格力电器股份有限公司 Setting method of intelligent control scene and equipment control system
CN113114779B (en) * 2021-04-23 2022-09-02 杭州萤石软件有限公司 Configuration method, terminal and system for linkage of Internet of things equipment

Also Published As

Publication number Publication date
WO2023221995A1 (en) 2023-11-23

Similar Documents

Publication Publication Date Title
JP7254894B2 (en) Connected lighting system
US11438939B2 (en) Discovery of connected devices to determine control capabilities and meta-information
US10158536B2 (en) Systems and methods for interaction with an IoT device
CN107948231B (en) Scene-based service providing method, system and operating system
CN105490897A (en) Household appliance control method and device, as well as mobile terminal
US11782590B2 (en) Scene-operation method, electronic device, and non-transitory computer readable medium
WO2022022121A1 (en) Interactive method for establishing device linkage scene, and storage medium and electronic device
EP3760008B1 (en) Rendering a dynamic light scene based on one or more light settings
CN111123851A (en) Method, device and system for controlling electric equipment according to user emotion
CN109240098B (en) Equipment configuration method and device, terminal equipment and storage medium
WO2023051643A1 (en) Device control method, related apparatus, and communication system
CN106154853A (en) A kind of control method and mobile terminal
EP3959590A1 (en) User interface for audio message
CN117092926B (en) Equipment control method and electronic equipment
CN110794773A (en) Click-type scene creating method and device
TWI738832B (en) Scene-based application operation method, device, terminal equipment and operating system
WO2016123847A1 (en) Application control method and device
CN109783144A (en) Processing method, device and the storage medium of variable in virtual environment interaction realization
CN117130284A (en) Intelligent device control method and electronic device
CN116136659A (en) Intelligent device control method and electronic device
WO2024067308A1 (en) Smart device control method, electronic device, and system
CN116974644A (en) Configuration method of control page, equipment control method and related equipment
KR102426564B1 (en) User interface for audio message
CN117539174A (en) Equipment control method, device, electronic equipment and storage medium
CN115766312A (en) Scene linkage demonstration method and device, electronic equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination