GB2578624A - Home automation system - Google Patents

Home automation system

Info

Publication number
GB2578624A
GB2578624A GB1817880.6A GB201817880A
Authority
GB
United Kingdom
Prior art keywords
space
user
management module
environment
parameters
Prior art date
Legal status
Withdrawn
Application number
GB1817880.6A
Other versions
GB201817880D0 (en)
Inventor
Paul Edwards Jonathan
Edwards David
Current Assignee
Individual
Original Assignee
Individual
Priority date
Filing date
Publication date
Application filed by Individual
Priority to GB1817880.6A
Publication of GB201817880D0
Publication of GB2578624A
Legal status: Withdrawn

Classifications

    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05B CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B15/00 Systems controlled by a computer
    • G05B15/02 Systems controlled by a computer electric
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L12/00 Data switching networks
    • H04L12/28 Data switching networks characterised by path configuration, e.g. LAN [Local Area Networks] or WAN [Wide Area Networks]
    • H04L12/2803 Home automation networks
    • H04L12/2823 Reporting information sensed by appliance or service execution status of appliance services in a home automation network
    • H04L12/2827 Reporting to a device within the home network; wherein the reception of the information reported automatically triggers the execution of a home appliance functionality
    • H04L12/2829 Reporting to a device within the home network; wherein the reception of the information reported automatically triggers the execution of a home appliance functionality involving user profiles according to which the execution of a home appliance functionality is automatically triggered
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05B CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00 Program-control systems
    • G05B2219/20 Pc systems
    • G05B2219/26 Pc applications
    • G05B2219/2642 Domotique, domestic, home control, automation, smart house

Landscapes

  • Engineering & Computer Science (AREA)
  • Automation & Control Theory (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Selective Calling Equipment (AREA)

Abstract

A system for adjusting an environment of a space according to saved preferences of a user. The system comprises: one or more device controllers configured to control one or more devices that can affect the environment; a detection module configured to detect the user's presence in the space; and a management module operative to: record user preferences for parameters of the environment of the space; and on detection of the user's presence in the space 310, retrieve the user preferences for the parameters of the environment of the space 320 and issue commands to the one or more device controllers to adjust parameters of the environment of the space in accordance with the retrieved user preferences 380. The system may not adjust parameters if another user’s profile is already being implemented in the space 330. The system may detect whether a user is moving from one space into another space 360 and may adjust parameters in said another space.

Description

Intellectual Property Office Application No. GB1817880.6 RTM Date: 26 April 2019. The following terms are registered trade marks and should be read as such wherever they occur in this document: Amazon (page 2), Alexa (page 2), Google (page 2), Apple (page 2), Siri (page 2), Amazon Echo (page 2), Google Home (page 2), HomePod (page 2), WiFi (pages 8, 9), Bluetooth (pages 8, 9, 19, 20), ZigBee (pages 19, 21). Intellectual Property Office is an operating name of the Patent Office. www.gov.uk/ipo
HOME AUTOMATION SYSTEM
Field of the Invention
The present disclosure relates to the field of home automation.
Background
In recent years there has been an increase in the development of devices for use within a building, be it a residential or commercial property, which add a layer of automation around everyday tasks such as controlling heating, lighting, ventilation and air conditioning.
The sophistication of building automation systems varies widely, and can depend on numerous factors such as: whether the automation system is installed in a new build or is retrofitted to an existing building; the building's purpose; the number of occupants of the building; and the allocated budget.
These developments in building automation systems come alongside advances in how people interact with audio and video devices in the home. The widespread adoption of streaming services for both audio and video (AV) content demonstrates the extent to which the provision of content on demand has become embedded in people's expectations of how their leisure time can best be spent.
Many current AV devices can be controlled remotely, and the level of remote control that is possible ranges from basic to advanced. At least some degree of automated control is possible for most devices that can operate in a building.
To provide the capability to control the vast array of different devices that are currently available, device controllers have been developed. These device controllers, which may be co-located within the building or other space, or alternatively may be cloud-based and thus remote from the building or other space, typically provide user-friendly graphical user interfaces that facilitate the configuration, control and automation of devices such as home AV, heating, lighting, ventilation and air conditioning devices.
Alongside these developments have come devices equipped with virtual assistants (VA) such as Amazon's "Alexa", Google's "Assistant" and Apple's "Siri", which can communicate with compatible external devices and device controllers to enable voice control of devices such as heating, lighting and audio-visual equipment. The rapid take-up of such VA equipped devices (e.g. "smart speaker" devices such as the Amazon Echo, Google Home, and Apple HomePod) bears testimony to how receptive people are to new ways of controlling their environment.
Despite developments in building and AV automation systems, to date automation has been largely room- or building-centric. The preferences of individuals present in the room or building are not automatically taken into account.
Summary
According to a first aspect, the invention provides a system for adjusting parameters of an environment of a space according to preferences of a user of the system, the system comprising: one or more device controllers configured to control one or more devices that can affect the environment of the space; a detection module configured to detect the user's presence in the space; and a management module in communication with the one or more device controllers and with the detection module, wherein the management module is operative to: record user preferences for parameters of the environment of the space; and on detection of the user's presence in the space, retrieve the user preferences for the parameters of the environment of the space and issue commands to the one or more device controllers to adjust parameters of the environment of the space in accordance with the retrieved user preferences.
The management module may be configured to: detect movement of the user from the space to a new space that is different from the space; and if, in response to a prompt, the user indicates that an environment of the new space should be adjusted in accordance with user preferences for the new space: issue commands to one or more device controllers of the new space to adjust parameters of an environment of the new space in accordance with the user preferences for the parameters of the environment of the space.
The management module may be configured to infer the user's egress from the space when it receives notification from the detection module of detection of the presence of the user in the new different space.
The management module may be configured to identify a location of a user based on notifications received from a plurality of detection modules indicating movement of the user through a plurality of detection zones.
The management module may be configured to: on detection of the user's presence in the space: detect whether another user is present in the space and parameters of the environment of the space have been adjusted in accordance with the another user's preferences; and if so, omit to issue commands to the one or more device controllers to adjust parameters of the environment of the space in accordance with the retrieved user preferences until the management module detects that either the another user has left the space, or detects that the another user has ceded control of the space.
The management module may include a touch screen and a user interface configured to receive programming inputs for user preferences for parameters of the environment in the space.
The management module may be configured to receive programming inputs for user preferences for parameters of the environment in the space from a device external to the management module that communicates with the management module via a communications interface of the management module.
The management module may be configured to identify the user prior to accepting programming inputs.
The management module may be configured to identify the user by: user input of a username and password; recognising a device worn or carried by the user; recognising a dedicated identification device worn or carried by the user; recognising biometric data input by the user via a biometric sensor of the management module.
The management module may be configured to: receive, from the one or more device controllers, information on user adjustments that affect the environment of the space; analyse the received information to identify a pattern of user behaviour; and adjust an existing program of user preferences for the environment of the space or develop a new program of user preferences for the environment of the space based on an identified pattern of user behaviour.
The parameters of the environment of the space may comprise, for example, one or more of: a temperature of the space; a brightness of lighting in the space; a colour of lighting in the space; a source of audio content to be played within the space; a volume of an audio source of audio to be played within the space; a source of video content to be displayed within the space; a volume of audio content associated with video content to be displayed within the space; a setting of a window dressing within the space; a setting of a projection screen within the space; or other parameters affecting the environment within the space.
The detection module may be configured to detect the user's presence in the space by: detecting the presence in the space of an identification device carried or worn by the user or attached to an item that is carried or worn by the user.
The identification device may comprise: an RFID tag; an RFID equipped device; an NFC tag; an NFC equipped device; or a ZigBee equipped device.
The detection module may be configured to detect the user's presence in the space by: detecting the presence in the space of a user device.
The user device may comprise: a mobile telephone; a smartwatch; or a personal monitoring device.
The detection module may be configured to detect the user's presence in the space by: detecting a beacon signal transmitted by a device within the space.
Alternatively or additionally, the detection module may be configured to detect the user's presence in the space by detecting a trigger phrase uttered by the user.
Alternatively or additionally, the detection module may be configured to detect the user's presence in the space by analysing audio data received at a microphone to detect speech or a voice command of a known user of the system.
The detection module may be configured to notify the management module when it detects that the user has left the space.
The one or more device controllers may be located in the space or may be located outside of the space. For example, the one or more device controllers may be cloud-based.
According to a second aspect of the invention there is provided a management module for a system for adjusting parameters of an environment of a space according to preferences of a user of the system, the management module comprising: a processor; a memory configured to store user preferences for parameters of the environment of the space; and one or more communications interfaces configured to permit communication between the management module and one or more detection modules of the system and between the management module and one or more device controllers of the system, wherein the management module is configured to: on receiving an indication that the presence of the user has been detected in the space: retrieve the user preferences for the parameters of the environment of the space; and issue commands to the one or more device controllers of the system to adjust parameters of the environment of the space in accordance with the retrieved user preferences.
The management module may be further configured to: detect movement of the user from the space to a new space that is different from the space; and if, in response to a prompt, the user indicates that an environment of the new space should be adjusted in accordance with user preferences for the new space: issue commands to one or more device controllers of the new space to adjust parameters of an environment of the new space in accordance with the user preferences for the parameters of the environment of the space.
The management module may be configured to infer the user's egress from the space when it receives notification from a detection module of the system of detection of the presence of the user in a different space.
The management module may be configured to identify a location of a user based on notifications received from a plurality of detection modules indicating movement of the user through a plurality of detection zones.
The management module may be configured to: on detection of the user's presence in the space: detect whether another user is present in the space and parameters of the environment of the space have been adjusted in accordance with the another user's preferences; and if so, omit to issue commands to the one or more device controllers to adjust parameters of the environment of the space in accordance with the retrieved user preferences until the management module detects that either the another user has left the space, or detects that the another user has ceded control of the space.
The management module may include a touch screen and a user interface configured to receive programming inputs for user preferences for parameters of the environment in the space.
The management module may be configured to receive programming inputs for user preferences for parameters of the environment in the space from a device external to the management module that communicates with the management module via a communications interface of the management module.
The management module may be configured to identify the user prior to accepting programming inputs.
The management module may be configured to identify the user by: user input of a username and password; recognising a device worn or carried by the user; recognising a dedicated identification device worn or carried by the user; recognising biometric data input by the user via a biometric sensor of the management module; and/or recognising the voice of the user from an utterance made by the user.
The management module may be configured to: receive, from the one or more device controllers of the system, information on user adjustments that affect the environment of the space; analyse the received information to identify a pattern of user behaviour; and adjust an existing program of user preferences for the environment of the space or develop a new program of user preferences for the environment of the space based on an identified pattern of user behaviour.
The parameters of the environment of the space may comprise one or more of: a temperature of the space; a brightness of lighting in the space; a colour of lighting in the space; a source of audio content to be played within the space; a volume of an audio source of audio to be played within the space; a source of video content to be displayed within the space; a volume of audio content associated with video content to be displayed within the space; a setting of a window dressing within the space; a setting of a projection screen within the space; or other parameters affecting the environment within the space.
Brief Description of the Drawings
Embodiments of the invention will now be described, strictly by way of example only, with reference to the accompanying drawings, of which: Figure 1 is a schematic view of an automation system; Figure 2 is a schematic representation of a management module suitable for use in the automation system of Figure 1; and Figure 3 is a flow diagram illustrating operations performed by the management module of Figure 2.
Detailed Description
The present disclosure describes a system which is able automatically to recognise the presence of an individual in a space such as a room, and adapt the environment (e.g. lighting, temperature, content provided by audio-visual devices, volume of such audio-visual content, state (open/closed/partially open etc.) of window dressings such as curtains and blinds, state (deployed/stowed etc.) of a stowable projection screen as examples) in the space automatically, to accommodate the preferences of the individual. In the following disclosure the term "room" has been used, but it is to be understood that the term "room" in this context refers to any distinct area (e.g. a room in a building, a garage, patio area, outbuilding or the like) whose environment can be controlled by the system.
Referring first to Figure 1, an automation system is shown generally at 100. The automation system 100 includes a detection module 110 configured to detect the presence of a user of the system 100 within a space such as a room or other space in which the system 100 is operational.
The automation system 100 also includes a management module 120 which is communicatively coupled to the detection module 110 and to one or more device controllers 130.
The device controllers 130 are configured to control the operation of one or more devices 140 that are controlled by the automation system 100 in accordance with commands received from the management module 120.
The management module 120 may communicate with the detection module 110 using a wired communication protocol such as Ethernet or some other wired communications protocol, or alternatively may communicate wirelessly with the detection module 110 using, for example, WiFi, Bluetooth or some other wireless communications protocol.
Similarly, the management module 120 may communicate with the device controllers using a wired communication protocol such as Ethernet or some other wired communications protocol, or alternatively may communicate wirelessly with the device controllers 130 using, for example, WiFi, Bluetooth or some other wireless communications protocol.
The management module 120 is configured to record user preferences for the space in which the system 100 is operational. When the user's presence within the space is detected by the detection module 110, the detection module transmits a signal to the management module 120 indicating that the user's presence has been detected, and the management module 120 retrieves stored preferences for that user for that space, and (provided that no conflict exists with another user already present within the environment, as will be discussed in more detail below) issues commands to the device controller(s) 130 to control the operation of the device(s) 140 so as to adjust the environment within the space in accordance with the user's preferences.
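By way of illustration only, the following minimal Python sketch shows one way in which this detection-and-retrieval flow could be structured. The class and method names (ManagementModule, DeviceController, on_presence_detected and so on) are assumptions made purely for the example and are not part of the described system.

```python
# Illustrative sketch only: preferences are keyed by (user, space) and applied
# when a detection module reports a user's presence.

class DeviceController:
    def __init__(self, name):
        self.name = name

    def apply(self, parameter, value):
        # In a real installation this would drive the underlying device 140.
        print(f"{self.name}: set {parameter} = {value}")


class ManagementModule:
    def __init__(self):
        self.preferences = {}   # (user, space) -> {parameter: value}
        self.controllers = {}   # space -> list of DeviceController
        self.active_user = {}   # space -> user whose mood is currently active

    def record_preferences(self, user, space, params):
        self.preferences[(user, space)] = dict(params)

    def on_presence_detected(self, user, space):
        """Called by a detection module when a user is detected in a space."""
        if space in self.active_user and self.active_user[space] != user:
            return  # another user's mood is active; precedence rules apply
        params = self.preferences.get((user, space))
        if params is None:
            return  # no stored mood for this user and space
        self.active_user[space] = user
        # For simplicity every controller in the space receives every
        # parameter; a real installation would route each parameter to the
        # relevant controller only.
        for controller in self.controllers.get(space, []):
            for parameter, value in params.items():
                controller.apply(parameter, value)


# Usage example
mm = ManagementModule()
mm.controllers["kitchen"] = [DeviceController("kitchen lighting"),
                             DeviceController("kitchen heating")]
mm.record_preferences("user1", "kitchen", {"temperature_c": 20, "brightness": 8})
mm.on_presence_detected("user1", "kitchen")
```

The conflict check shown here is deliberately simplified; the precedence rules applied when another user's mood is already active in the space are discussed later in this description.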
The management module 120 is further configured to control user access rights to the system 100, to generate user passwords and/or other login information and to record information about user devices that have been detected or have attempted to access the system.
Turning now to Figure 2, an example of a management module suitable for use in the automation system 100 of Figure 1 is shown generally at 120.
The management module 120 in the illustrated example includes one or more processors 210, memory 220, and a communications interface 230 configured to enable bidirectional communication between the management module 120 and the detection module 110 and to enable bidirectional communication between the management module 120 and the device controller(s) 130 of the system 100 of Figure 1.
The communications interface 230 may be configured to operate under a plurality of different communications protocols. For example, the communications interface may be configured to operate under a wired communications protocol such as Ethernet or the like and under one or more wireless communications protocols such as WiFi, Bluetooth or the like, in order to accommodate installations in which communication between the management module 120 and the detection module 110 employs a different communication protocol than communication between the management module and the device controller(s) 130.
The communications interface 230 also enables bidirectional communications between the management module 120 and external devices such as smartphones, tablet or laptop computers and smart devices such as smart speakers that incorporate virtual assistant technologies, to allow the management module 120 to be programmed or controlled by such external devices.
The management module 120 in the illustrated example includes a touch screen 250, on which is presented a user interface (UI) 260 that enables a user to program the management module 120 with user preferences for the environment in each of a plurality of spaces (e.g. rooms in a building) that include devices 140 whose functionality can be controlled by means of the management module 120 and device controllers 130.
The UI 260 therefore includes controls 262, 264, 266, 268, 270 (e.g. graphical representations of buttons presented on the touch screen 250) that permit a user to select a room or other space whose environment can be controlled by the system 100, and controls (e.g. graphical representations of buttons presented on the touch screen 250) for setting environmental parameters within the selected space. For example, the UI 260 may include a lighting control 272 permitting a user to set parameters such as the brightness and colour of the lighting in the selected space. The UI 260 may further include a temperature control 274 permitting the user to set the temperature in the selected space. The UI 260 may further include an audio control 276 permitting the user to set parameters of audio to be played by audio equipment in the selected space, such as a desired source of audio content to be played within the space (e.g. a radio channel, playlist, album or the like), and the volume of such audio content. Similarly, the UI 260 may further include a video control 278 permitting the user to set parameters of video to be played by audio-visual equipment such as a television and associated set-top box in the selected space, such as a desired source of video content (e.g. a television channel, film or the like), and the volume of the audio associated with the video content. The UI 260 may further include a window dressing control 280 permitting the user to adjust settings (e.g. open/closed/partially open etc.) of window dressings such as curtains, blinds and the like in the selected space.
The management module 120 can be programmed, via the UI 260, with a plurality of users' preferences for the environment in each of the plurality of spaces, and these preferences are stored by the management module 120 (e.g. in the memory 220) such that when a user's presence in one of the plurality of spaces is detected by the detection module 110, that user's preferences for the environment in that space can be retrieved by the management module 120 and appropriate commands or control signals can be transmitted to the relevant device controller(s) 130 in the space to control the devices 140 to implement the user's preferences.
In order to assign programmed user preferences for the environment in a particular space to a particular user, the management module 120 is configured to identify and/or authenticate a user who is programming the management module 120. For example, the UI 260 may present a user login screen that must be completed by the user by entering a username and password before the UI will permit the user to enter programming inputs to the UI 260.
Alternatively (or additionally) the management module 120 may be configured to identify and/or authenticate the user by recognising a device such as a mobile telephone, smart phone, smart watch, personal monitoring device such as a fitness tracker or other personal device worn or carried by the user.
Alternatively (or additionally) the management module 120 may be configured to identify and/or authenticate the user by recognising a dedicated identification device such as an RFID tag or RFID equipped device worn or carried by the user or attached to an item that is worn or carried by the user.
Alternatively (or additionally) the management module 120 may be configured to identify and/or authenticate the user by means of one or more biometric sensors such as a fingerprint sensor, iris scanner or the like.
Once the user has been identified (using any of the above described methods), the UI 260 can permit the user to enter preferences, which are stored by the management module 120, for example in the memory 220, as programs of user preferences, also referred to herein as "moods".
As an alternative to programming the management module 120 using the UI 260, a user may program the management module 120 using a device external to the management module 120 that communicates with the management module 120 via the communications interface 230. Thus, in some embodiments the touch screen 250 and UI 260 can be omitted from the management module 120. For example, an application running on a smartphone, tablet or laptop computer communicatively coupled to the management module 120 via the communications interface 230 may be used to program the management module 120. Alternatively, the management module 120 may be programmed using voice commands issued by the user to a virtual assistant operating on a device such as a smart speaker that is communicatively coupled to the management module 120 via the communications interface 230.
Where the management module 120 is programmed by an external device, that external device may perform the function of identifying and/or authenticating the user who is programming the management module. For example, where an application executing on a device such as a smartphone, tablet or laptop computer is being used to program the management module, the smartphone, tablet or laptop computer may identify and/or authenticate the user itself, for example using a username and password login mechanism provided in the application, or by using biometric identification mechanisms such as a fingerprint recognition, iris recognition, face recognition, voice recognition, speaker recognition (i.e. identification of an individual by characteristics of their speech) mechanism or the like that may be provided by the device. Similarly, where the management module 120 is programmed using voice commands issued by the user to a virtual assistant, the device on which the virtual assistant operates may identify and/or authenticate the user, for example using a voice password, voice recognition or speaker recognition mechanism of the virtual assistant equipped device.
The management module 120 permits the user to enter moods (either through the UI 260 or through a remote device as described above) for each space whose environment can be controlled by the system 100, and can be programmed with moods for each space for particular times of the day and/or particular days of the week.
As well as permitting the user to pre-configure moods for each space, the management module 120 also allows the user to enter moods on the fly within a space, by using the controls 262, 264, 266, 268, 270 of the UI 260 to adjust directly settings and parameters of the devices 140 within that space. Once the settings and parameters of the devices 140 have been adjusted by the user to meet their preferences, the management module may prompt the user (e.g. via a spoken prompt issued by a virtual assistant or via a prompt presented by an application executing on a user device such as a mobile telephone) to confirm whether the program of preferences should be stored as a mood for that user for that space. The management module 120 may further prompt the user to indicate whether the stored mood is a "static" mood, i.e. a mood that applies to that space only, or a "portable" mood, i.e. a mood that can "follow" the user as they move between different spaces, as will be described in more detail below.
The management module 120 permits a user to view and select any mood that that user has previously stored in the management module 120. Such previously stored moods may include preferred audio playlists, streaming services and other preferences. A user's previously stored moods can be accessed by the user via the UI 260 of the management module 120, or via an application executing on a user device such as a mobile telephone, or via spoken commands issued to a virtual assistant, and any previously stored mood can be selected and implemented in the space in which the user is currently located (to the extent possible with the devices 140 that are present in the space), regardless of whether the selected mood was initially stored for a different space.
As will be appreciated by those skilled in the art, in order to achieve the same audio experience in different spaces that may be equipped with different audio output devices or may have different acoustic environments, different settings (e.g. volume settings, equaliser settings etc.) may be required for the audio devices in different spaces. During commissioning of the system 100 any requirements for different audio settings in different spaces are identified and the required settings are programmed into the management module 120 to ensure that as far as possible the audio experience for a particular mood is the same regardless of the space in which that mood is implemented.
As an example, a first user (user 1) may program the management module 120 with first and second moods or programs (program 1, program 2) containing the following preferences for a kitchen:

Program 1
User: User 1
Space: Kitchen
Days: Monday - Friday
Time: 0630-0800
Temperature: 20°C
Lighting: Colour white, brightness level 8
Audio: Radio channel no. 1, volume level 5
Video: TV news channel no. 1, volume level 0 (mute)

Program 2
User: User 1
Space: Kitchen
Days: Saturday, Sunday
Time: 0800-0900
Temperature: 20°C
Lighting: Colour white, brightness level 8
Audio: Radio channel no. 2, volume level 7
Video: None

Similarly, the first user may program the management module with third and fourth moods or programs (program 3, program 4) containing the following preferences for a living room:

Program 3
User: User 1
Space: Living Room
Days: Monday - Friday
Time: 1900-2200
Temperature: 20°C
Lighting: Colour white, brightness level 4
Audio: None
Video: TV entertainment channel no. 4, volume level 8

Program 4
User: User 1
Space: Living Room
Days: Saturday, Sunday
Time: 1900-2200
Temperature: 20°C
Lighting: Colour white, brightness level 4
Audio: None
Video: TV entertainment channel no. 2, volume level 8

A second user (user 2) may program the management module with fifth and sixth moods or programs (program 5, program 6) containing their own preferences for the kitchen, and with seventh and eighth moods or programs (program 7, program 8) containing their own preferences for the living room, as follows:

Program 5
User: User 2
Space: Kitchen
Days: Monday - Friday
Time: 0800-1000
Temperature: 19°C
Lighting: Colour white, brightness level 8
Audio: Playlist 1, volume level 8
Video: None

Program 6
User: User 2
Space: Kitchen
Days: Saturday, Sunday
Time: 0800-0900
Temperature: 19°C
Lighting: Colour white, brightness level 8
Audio: None
Video: TV news channel no. 1, volume level 5

Program 7
User: User 2
Space: Living Room
Days: Monday - Friday
Time: 1000-1200
Temperature: 20°C
Lighting: Colour white, brightness level 8
Audio: Radio channel no. 2, volume level 7
Video: None

Program 8
User: User 2
Space: Living Room
Days: Saturday, Sunday
Time: 1900-2200
Temperature: 19°C
Lighting: Colour white, brightness level 2
Audio: None
Video: TV entertainment channel no. 4, volume level 7

As will be appreciated from the above examples, the management module 120 may be programmed with a plurality of moods or programs containing the preferences of different users for each space whose environment can be controlled by the system 100. When the presence of a user in a space is detected by the detection module 110 at a time specified by a program for that space, the management module 120 sends appropriate control signals or commands to the device controller(s) 130 in that space to cause the relevant device(s) 140 to implement the relevant mood or program.
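For illustration only, a mood or program of the kind shown above could be represented as a simple record. The field names in this Python sketch are assumptions chosen for the example rather than a prescribed storage format.

```python
# Illustrative sketch only: one possible in-memory representation of a stored
# mood or program; field names are assumptions made for the example.
from dataclasses import dataclass
from typing import Optional

@dataclass
class Mood:
    user: str
    space: str
    days: tuple                      # e.g. ("Mon", "Tue", "Wed", "Thu", "Fri")
    start: str                       # "HH:MM"
    end: str                         # "HH:MM"
    temperature_c: Optional[float] = None
    lighting: Optional[dict] = None  # {"colour": ..., "brightness": ...}
    audio: Optional[dict] = None     # {"source": ..., "volume": ...}
    video: Optional[dict] = None     # {"source": ..., "volume": ...}
    portable: bool = False           # True if the mood may follow the user

# Program 1 from the example above, expressed as such a record
program_1 = Mood(
    user="User 1", space="Kitchen",
    days=("Mon", "Tue", "Wed", "Thu", "Fri"),
    start="06:30", end="08:00",
    temperature_c=20,
    lighting={"colour": "white", "brightness": 8},
    audio={"source": "Radio channel 1", "volume": 5},
    video={"source": "TV news channel 1", "volume": 0},
)
print(program_1)
```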
Where a conflict exists between two moods or programs because two or more users are present in a space and each user has a program for that space at that time (e.g. program 2 above conflicts with program 6 above, because both programs specify parameters in the kitchen on Saturday and Sunday between 0800 and 0900), rules of precedence are applied by the management module 120 to resolve the conflict, as will be described in more detail below.
A user may adjust the environment within a space even when the environment in that space is being controlled by the management module 120 in accordance with a user-specified mood or program. For example, the user may manually adjust settings of the device controller(s) 130 within the space, for example to adjust the temperature or lighting, or the operation of audio or video equipment within the space. Alternatively, the user may adjust the settings of one or more devices 140 within a space using an application running on a device such as a smartphone, tablet or laptop computer, or by issuing voice commands to a device equipped with a virtual assistant.
Information on adjustments to the settings of the device(s) 140 within a space may be fed back to the management module 120 via the device controller(s) 130, and the management module 120 may log the adjusted settings, the day and time at which the adjustments were made to the settings, and whether the adjustments were made by the user who initially specified the mood or program that is currently being implemented by the management module 120 (based on information received from the detection module 110 indicating that the user's presence is detected in the space). By logging such changes over a period of time (e.g. a number of days or weeks), the management module 120 can build a database of adjustments to the user's programmed mood or program, and can analyse this database to identify patterns of adjustments. Where a pattern of adjustments is identified (e.g. the user regularly adjusts the audio visual settings to show a particular TV channel at 8.00pm on a Tuesday evening), the management module 120 can suggest a change to the relevant mood or program to the user, via an appropriate prompt, which the user can accept or reject. Alternatively, the management module 120 may implement the change to the relevant mood or program without prompting the user. Thus the management module 120 is able to apply machine learning to identify patterns of user behaviour and adapt the user-specified programs in accordance with the identified patterns of user behaviour.
This machine learning can also be applied by the management module 120 where a user-specified mood or program is not being implemented in order to suggest a new mood or program to the user, based on identified patterns of user behaviour. Thus, the device controller(s) 130 in a particular space can feed back to the management module 120 information on settings applied by a user to the devices 140 in the space. The management module 120 may log details of the space, the applied settings, the day and time at which the settings were applied, and the identity of the user who was present in the space when the settings were applied. By logging such data over a period of time, the management module 120 can build a database of device settings for particular users and particular spaces, and can analyse this database to identify patterns. If a particular pattern of user-applied settings is identified in a particular space (e.g. if a particular user habitually sets the temperature in the kitchen to 20°C and sets an audio device in the kitchen to play a particular playlist on Sundays at 12.00) then the management module 120 can suggest a new mood or program implementing the settings of the identified pattern to the user, via an appropriate prompt, which the user can accept or reject. If the user accepts the new mood or program the management module 120 stores it in the memory 220 such that it can subsequently be implemented when the presence of the user in the space is detected by the detection module 110. Alternatively, the management module 120 may store the new mood or program in memory 220 without prompting the user. Thus the management module 120 is able to apply machine learning to identify patterns of user behaviour and develop new moods or programs in accordance with the identified patterns of user behaviour.
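As a purely illustrative sketch, the following Python fragment shows one very simple way of identifying a recurring adjustment in such a log; a deployed system could use considerably more sophisticated pattern-recognition or machine-learning techniques, and the log format shown is an assumption made for the example.

```python
# Illustrative sketch only: spotting a recurring user adjustment in logged data.
from collections import Counter

# Each log entry: (user, space, day_of_week, hour, parameter, value)
adjustment_log = [
    ("user1", "living room", "Tue", 20, "video_source", "channel 5"),
    ("user1", "living room", "Tue", 20, "video_source", "channel 5"),
    ("user1", "living room", "Tue", 20, "video_source", "channel 5"),
    ("user1", "kitchen", "Sun", 12, "audio_source", "playlist 1"),
]

def recurring_adjustments(log, min_occurrences=3):
    """Return adjustments seen at least min_occurrences times."""
    counts = Counter(log)
    return [entry for entry, n in counts.items() if n >= min_occurrences]

for user, space, day, hour, parameter, value in recurring_adjustments(adjustment_log):
    print(f"Suggest updating {user}'s mood for {space}: "
          f"{parameter} -> {value} on {day} at {hour}:00")
```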
The management module 120 may be configured such that a mood or program of user preferences "follows" the user as they move between a first space and a second space, if no mood or program has been specified for that user for the second space.
For example, if the management module 120 is implementing a stored mood or program while the presence of a user is detected in the living room, when the user leaves the living room and enters the kitchen (as detected by the detection module 110), the management module 120 may apply the user preferences of the mood or program that was being implemented in the living room to the kitchen, insofar as is possible with the devices that are available in the kitchen.
To support this functionality the management module 120 maintains a database of the devices that are available in each space whose environment can be controlled by the system 100. If the management module 120 is unable to implement or fully implement the mood or program of user preferences from one space in another space, the management module 120 may notify the user of this fact, for example by playing a tone or recorded announcement on an audio device of the second space, by displaying a message on a video device such as a television of the second space, by sending a message to a device such as a mobile phone, smart watch or other personal device carried or worn by the user, or by a spoken announcement made by a virtual assistant.
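The following sketch illustrates, using assumed names and data structures, how a mood might be filtered against the devices recorded as available in a destination space, with any settings that cannot be implemented reported back to the user.

```python
# Illustrative sketch only: filtering a mood down to the devices actually
# present in the destination space; names are assumptions for the example.
devices_in_space = {
    "living room": {"heating", "lighting", "audio", "video"},
    "kitchen": {"heating", "lighting", "audio"},
}

def portable_settings(mood_settings, space):
    """Return the subset of a mood that the destination space can implement,
    together with the settings that had to be dropped."""
    available = devices_in_space.get(space, set())
    applied = {k: v for k, v in mood_settings.items() if k in available}
    dropped = {k: v for k, v in mood_settings.items() if k not in available}
    return applied, dropped

mood = {"heating": 20, "lighting": 4, "video": "entertainment channel 4"}
applied, dropped = portable_settings(mood, "kitchen")
if dropped:
    # In the described system this notification could be a tone, an on-screen
    # message, a message to a personal device, or a spoken announcement.
    print(f"Cannot fully implement mood in kitchen; unavailable: {sorted(dropped)}")
```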
As indicated above, this "follow" functionality may be implemented where the management module 120 does not have a stored mood or program for the user for the second space. Where a mood or program has been specified by the user for the second space, that mood or program will be implemented in the second space. Alternatively, the management module 120 may be configured to offer the user a choice (e.g. via an appropriate prompt sent to the user's mobile telephone or other device, or via prompt spoken by a virtual assistant) between implementing the stored mood or program for the second space and implementing the mood or program for the first space in the second space when the presence of the user is detected in the second space.
A "suspend" function may be provided by the system 100 which, when activated by a user, prevents a mood or program that is active in a space from following the user when that user temporarily leaves that space and enters a different space (e.g. when the user leaves a living room to make a drink in a kitchen). Thus, if a user has the "suspend" function active, the user's mood will continue in the original space and will not automatically follow the user into the different space. The "suspend" function also ensures that if a different user enters the space during the original user's temporary absence, the management module 120 will not implement the different user's mood for that space, but will instead continue to implement the original user's mood.
A timeout mechanism may be provided by the management module 120, such that the management module 120 ceases to implement the original user's mood in the event that the original user does not return to the space within a predetermined period of time, which may be specified by the user. This ensures that the original user's suspended mood does not prevent the management module 120 from implementing a different user's mood in the space in the event that the original user does not return to the space within the predetermined period of time.
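Purely by way of example, the suspend function and its timeout could be modelled as follows; the class and function names are assumptions for the sketch and not part of the described system.

```python
# Illustrative sketch only: a suspended mood that expires after a timeout.
import time

class SuspendedMood:
    def __init__(self, user, space, timeout_seconds):
        self.user = user
        self.space = space
        self.expires_at = time.monotonic() + timeout_seconds

    def expired(self):
        return time.monotonic() > self.expires_at

def may_take_over(new_user, space, suspension):
    """A different user's mood may only be implemented in the space once the
    suspension has expired (or if no suspension exists for that space)."""
    if suspension is None or suspension.space != space:
        return True
    return suspension.expired()

# When the original user temporarily leaves the living room:
suspension = SuspendedMood("user1", "living room", timeout_seconds=15 * 60)
print(may_take_over("user2", "living room", suspension))  # False until the timeout expires
```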
If the user leaves a space (the living room in the example above) in which other users are present, then the management module 120 will prompt the user, on detection of the user's presence in the different space (the kitchen in the example above) to indicate whether the user wishes the mood or program that was being implemented in the space that the user has just left to be implemented in the different space that the user has just entered. This prompt may be issued by a virtual assistant or by an application executing on a user device such as a mobile telephone.
If the user indicates that the mood is to be implemented in the different space, the management module 120 implements the user's mood in the different space that the user has just entered (the kitchen in the example above), and ceases to implement the user's mood in the space that the user has just left (the living room in the example above), allowing another user's mood to be implemented in the space that the user has just left.
Alternatively, the user may indicate that the mood is to be implemented in the different space that the user has just entered (the kitchen in the example above), but is to remain active in the space that the user has just left (the living room in the example above). In such a case the management module 120 implements the user's mood in the different space that the user has just entered, and continues to implement the user's mood in the space that the user has just left.
As discussed above, the presence of a user in a space is detected by a detection module 110. A detection module 110 may be provided in each space (e.g. each room of a house) whose environment can be controlled by the system 100, and/or in areas such as hallways and landings where detection of user movement between spaces is desirable.
The detection module 110 may be a dedicated standalone device which is configured to detect the presence of a user by detecting a dedicated user identification device such as an RFID (radio frequency identification) tag or RFID equipped device, or NFC (near-field communication) tag or NFC equipped device or ZigBee equipped device carried or worn by the user or attached to an item carried or worn by the user.
Alternatively, the detection module 110 may be configured to detect the presence of a user by detecting the presence of a user device such as a mobile telephone, smartphone, smartwatch, personal monitoring device such as a fitness tracker or the like carried or worn by the user. For example, the detection module 110 may be a Bluetooth device that is paired with the user device and thus is able to detect when the device (and by extension the user) enters the space covered by the detection module 110.
Alternatively, the detection module 110 may be implemented by a device such as a mobile telephone, smart phone, smart watch, personal monitoring device such as a fitness tracker or the like, or by a device such as a smart speaker having virtual assistant functionality.
For example, an application running on a smartphone, smartwatch, personal monitoring device such as a fitness tracker or other personal user device that is specific to a user may cause the device to monitor Bluetooth beacons. On detecting a Bluetooth beacon transmitted by a device within a space such as a room whose environment can be controlled by the system 100, the user device wirelessly transmits (e.g. via WiFi or Bluetooth) a beacon identifier of the detected Bluetooth beacon to the management module 120. As the Bluetooth beacon identifier is unique to the device within the space, the management module 120 is able to determine the location of the user device from the received Bluetooth beacon identifier.
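As an illustrative sketch only, an application on the user device might report a detected beacon identifier to the management module along the following lines. The scan_for_beacons() helper is a hypothetical placeholder for a platform-specific Bluetooth scan, and the reporting transport (WiFi, Bluetooth or otherwise) is left abstract.

```python
# Illustrative sketch only: a user device reporting a detected Bluetooth beacon
# identifier so that the management module can map it to a space.
import json

def scan_for_beacons():
    """Placeholder for a platform-specific Bluetooth beacon scan; returns a
    hard-coded identifier purely for illustration."""
    return ["beacon-kitchen-01"]

def report_presence(user_id, beacon_id):
    """Build the message the user device would send to the management module.
    The management module maps the (unique) beacon identifier to a space."""
    message = {"user": user_id, "beacon": beacon_id}
    print("Would report:", json.dumps(message))

for beacon_id in scan_for_beacons():
    report_presence("user1", beacon_id)
```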
As another example, a virtual assistant device such as a smart speaker assistant can detect the presence of a user in a space such as a room. A trigger phrase may be uttered by the user on entering the space, which trigger phrase is recognised by the virtual assistant device. The virtual assistant device can then transmit a signal to the management module 120 indicating the presence of the user in the space. Alternatively, passive speaker recognition techniques may be used by the virtual assistant device to detect the presence of a user within a space. In such techniques the user need not speak a trigger word, but instead the virtual assistant device continually analyses audio data received at a microphone to determine whether a known user is speaking. On detection of the user by the virtual assistant device on the basis of the received audio, the virtual assistant device transmits a signal to the management module 120 indicating the presence of the user in the space.
When the presence of a user has been detected by the detection module 110 (whether implemented as a dedicated standalone device or otherwise as discussed above), the detection module 110 may cause an audible or visual indication that the user has been recognised to be output, for example via audio and/or video devices that are present in the space in which the user has been detected, or via a personal user device carried or worn by the user such as a mobile telephone, smart phone, smart watch, personal monitoring device such as a fitness tracker or the like.
The egress of a user from a space (e.g. a room) can be inferred by the management module 120 when the user's presence is detected in another space. Alternatively, the detection module 110 may be configured to notify the management module 120 when it detects that the user has left the space, e.g. by detecting the absence of a dedicated user identification device such as an RFID (radio frequency identification) tag, or NFC (near-field communication) tag or ZigBee-enabled device carried or worn by the user or attached to an item carried or worn by the user, by detecting the absence of a Bluetooth beacon signal or by detecting the absence of the user's voice in the space.
As will be appreciated, a user may possess more than one device that can be used by the detection module 110 to detect the presence of the user within a space. For example, a user may possess a smart phone and a smart watch, and may also be provided with a dedicated identification device such as an RFID tag. In order to avoid false activations of the devices within a space by the management module 120 in response to detection of the presence of a user device in the space when the user is not present (e.g. if the user has left their smartwatch in a bedroom but has moved to a kitchen or living room), the management module 120 may be configured to identify the location of the user based on detection of movement of a user device such as a mobile telephone through multiple detection zones or points. In the example above the user may move, with their mobile telephone, out of the bedroom, through a landing area and stairway area before arriving in the kitchen or living room. Each of these different areas may be provided with a different detection module 110, and the management module 120 may identify the presence of the user in the kitchen or living room on the basis of signals received from the different detection modules 110 as the user moves from the bedroom to the kitchen.
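The following sketch, using assumed names and a deliberately simplified rule, illustrates how the management module might infer the user's current space from a sequence of detection-zone notifications rather than from a single device sighting.

```python
# Illustrative sketch only: tracking recent detection-zone notifications for a
# user device so that location is inferred from movement, not one sighting.
from collections import deque

class LocationTracker:
    def __init__(self, history_length=5):
        self.history = deque(maxlen=history_length)  # recent (zone, device) sightings

    def notify(self, zone, device):
        self.history.append((zone, device))

    def current_space(self):
        """In this simplified sketch the most recent sighting is taken as the
        location; a fuller version could require a consistent sequence of
        sightings through adjacent zones before accepting the new location."""
        if not self.history:
            return None
        return self.history[-1][0]

tracker = LocationTracker()
for zone in ["bedroom", "landing", "stairway", "kitchen"]:
    tracker.notify(zone, device="mobile phone")
print(tracker.current_space())  # kitchen
```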
It will be understood that there is potential for conflict between the mood or program of preferences of a first user who is already present in a space such as a room and the mood or program of a second user entering the space, in that the management module 120 may detect the presence of the second user in the space, and may seek to implement the second user's mood or program. In such circumstances the management module 120 may apply rules of precedence. Under such rules, the mood or program of preferences of the first user will usually be maintained until the first user either leaves the space or cedes control of the space to the second user by issuing a command to the management module 120 to suspend implementation of their mood or program of preferences (e.g. by speaking a command to a virtual assistant or by issuing a command via an application running on their phone), at which point the management module will cease implementation of the first user's mood or program of preferences and implement the second user's mood or program of preferences.
If the first user enters a new space where no user-specified mood or program is active, then the management module 120 may implement the first user's mood or program of preferences in the new space, insofar as that is possible with the devices present in the new space. In other words, the first user's mood or program of preferences may follow the first user to a new space.
As will be appreciated, it may be desirable for the owner or operator of the system to permit visitors to use the functionality of the system 100, or a subset of that functionality (e.g. a set of simple moods or programs to switch on a bedroom audio-visual system and to turn on a lighting pathway when moving through the building at night). In order to grant visitors access to the system's functionality, the owner or operator of the system 100 may send an invitation (e.g. a web link) to visitors to allow the visitors to enrol their devices in the system temporarily, for a period of time specified by the owner or operator of the system 100.
Figure 3 is a flowchart illustrating operations performed by the management module 120 during operation of the system 100.
In a first operation 310, the management module 120 receives a notification from the detection module 110 that the presence of a user has been detected in a space (e.g. a room) covered by the system.
The management module 120 then checks (step 320) whether there is a stored user-specified mood or program for the detected user for the space in which the user has been detected. If so the management module 120 checks, at step 330, whether another user's mood or program is currently being implemented in the space.
If no other user's mood or program is currently being implemented in the space then the management module 120, after establishing by means of a dialogue with the user (e.g. via a virtual assistant or prompts issued by an application running on a user device such as a mobile telephone) whether the user wishes to implement their stored user-specific mood or program in the space, sends appropriate commands to the relevant device controllers 130 in the space to implement the user's mood or program in the space, at step 340. On the other hand, if the management module 120 finds that another user's mood or program is already being implemented in the space, then (step 350) no commands are transmitted to the device controllers 130 in the space so that the other user's existing mood or program continues to be implemented in the space.
Returning to step 320, if the management module finds that there is no stored user-specified mood or program for the detected user for the space in which the user has been detected, then the management module 120 checks (step 360) whether the user is moving into the space from another space in which a mood or program for that user is active. If not, the management module 120 moves to step 350 and no commands are transmitted to the device controllers 130 in the space by the management module 120, and so no mood or program is implemented in the space.
If the management module 120 finds that the user is moving into the space (referred to below as the "new space") from another space (referred to below as the "previous space") in which a mood or program for that user is active, then the management module 120 may prompt the user (e.g. by causing the user's mobile telephone to display an appropriate prompt, or by causing a voice assistant to output a spoken prompt) to indicate whether they wish to implement the mood or program from the previous space in the new space. If the user's response is yes, then at step 380 the management module 120 transmits appropriate commands to the device controllers in the new space to implement the mood or program from the previous space in the new space. On the other hand, if the user's response is no, the management module (at step 350) transmits no commands to the device controllers 130 in the new space.
It is to be appreciated that the step 370 of prompting the user to indicate whether they wish to implement the mood or program from the previous space in the new space is optional, and the management module 120 may instead simply implement the mood or program from the previous space in the new space without prompting the user.
Further, the management module 120 may apply intermediate steps similar to steps 360 and 370 between steps 330 and 340. Thus, after determining at step 330 that another user's mood or program is not being implemented in the space, the management module may check whether the user is moving into the space (referred to below as the "new space") from another space (referred to below as the "previous space") in which a mood or program for that user is active, and if so, may prompt the user to choose between implementing the mood or program from the previous space in the new space, and implementing the stored user-specified mood or program for the new space in the new space. Then if the user chooses to implement the stored user-specified mood or program for the new space in the new space, the management module 120 performs step 340 and transmits appropriate commands to the device controllers 130 in the new space to implement the user-specified program or mood in that new space.
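To make the flow of Figure 3 concrete, the following Python sketch expresses steps 310 to 380 as a single function operating on a stub management module; the method names and the stub itself are assumptions made for the example, not part of the described implementation.

```python
# Illustrative sketch only: the decision flow of Figure 3. The stub below
# stands in for the behaviour described in the text.

class ManagementModule:
    def __init__(self, moods, active):
        self.moods = moods      # (user, space) -> mood settings dict
        self.active = active    # space -> user whose mood is currently active

    def stored_mood(self, user, space):
        return self.moods.get((user, space))

    def other_mood_active(self, space, user):
        return self.active.get(space) not in (None, user)

    def active_mood_following_user(self, user):
        for space, active_user in self.active.items():
            if active_user == user:
                return self.moods.get((user, space))
        return None

    def confirm_with_user(self, user, question):
        return True  # stands in for a prompt via a virtual assistant or phone app

    def implement(self, mood, space):
        self.active[space] = mood.get("user")
        print(f"Implementing in {space}: {mood}")


def on_presence_detected(mm, user, space):                                    # step 310
    mood = mm.stored_mood(user, space)                                        # step 320
    if mood is not None:
        if mm.other_mood_active(space, user):                                 # step 330
            return                                                            # step 350: no commands sent
        if mm.confirm_with_user(user, f"Implement your mood in {space}?"):
            mm.implement(mood, space)                                         # step 340
        return
    previous = mm.active_mood_following_user(user)                            # step 360
    if previous is None:
        return                                                                # step 350
    if mm.confirm_with_user(user, f"Bring your current mood into {space}?"):  # step 370
        mm.implement(previous, space)                                         # step 380


mm = ManagementModule(
    moods={("user1", "kitchen"): {"user": "user1", "temperature_c": 20}},
    active={},
)
on_presence_detected(mm, "user1", "kitchen")
```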
As will be appreciated from the foregoing discussion, the system described herein provides a flexible and adaptable automation system that can automatically adapt the environment in a space such as a room, garage, patio, shed, outbuilding or other space covered by the system to meet a user's preferences on detection of the user's presence in the space. Further, the system allows user programs or moods to follow the user as they move between spaces, and provides a mechanism for resolving conflict between a user entering a space and another user already occupying the space.
It should be noted that the above-mentioned embodiments illustrate rather than limit the invention, and that those skilled in the art will be able to design many alternative embodiments without departing from the scope of the appended claims. The word "comprising" does not exclude the presence of elements or steps other than those listed in a claim, and "a" or "an" does not exclude a plurality. Any reference signs in the claims shall not be construed so as to limit their scope.

Claims (31)

  1. A system for adjusting parameters of an environment of a space according to preferences of a user of the system, the system comprising: one or more device controllers configured to control one or more devices that can affect the environment of the space; a detection module configured to detect the user's presence in the space; and a management module in communication with the one or more device controllers and with the detection module, wherein the management module is operative to: record user preferences for parameters of the environment of the space; and on detection of the user's presence in the space, retrieve the user preferences for the parameters of the environment of the space and issue commands to the one or more device controllers to adjust parameters of the environment of the space in accordance with the retrieved user preferences.
  2. A system according to claim 1 wherein the management module is configured to: detect movement of the user from the space to a new space that is different from the space; and if, in response to a prompt, the user indicates that an environment of the new space should be adjusted in accordance with user preferences for the new space: issue commands to one or more device controllers of the new space to adjust parameters of an environment of the new space in accordance with the user preferences for the parameters of the environment of the space.
  3. A system according to claim 2 wherein the management module is configured to infer the user's egress from the space when it receives notification from the detection module of detection of the presence of the user in the new space.
  4. A system according to any one of the preceding claims wherein the management module is configured to identify a location of a user based on notifications received from a plurality of detection modules indicating movement of the user through a plurality of detection zones.
  5. A system according to any one of the preceding claims wherein the management module is configured to: on detection of the user's presence in the space: detect whether another user is present in the space and parameters of the environment of the space have been adjusted in accordance with the another user's preferences; and if so, omit to issue commands to the one or more device controllers to adjust parameters of the environment of the space in accordance with the retrieved user preferences until the management module detects that either the another user has left the space, or detects that the another user has ceded control of the space.
  6. A system according to any one of the preceding claims wherein the management module includes a touch screen and a user interface configured to receive programming inputs for user preferences for parameters of the environment in the space.
  7. A system according to any one of the preceding claims wherein the management module is configured to receive programming inputs for user preferences for parameters of the environment in the space from a device external to the management module that communicates with the management module via a communications interface of the management module.
  8. A system according to claim 6 or claim 7 wherein the management module is configured to identify the user prior to accepting programming inputs.
  9. A system according to claim 8 wherein the management module is configured to identify the user by: user input of a username and password; recognising a device worn or carried by the user; recognising a dedicated identification device worn or carried by the user; recognising biometric data input by the user via a biometric sensor of the management module.
  10. A system according to any one of the preceding claims wherein the management module is configured to: receive, from the one or more device controllers, information on user adjustments that affect the environment of the space; analyse the received information to identify a pattern of user behaviour; and adjust an existing program of user preferences for the environment of the space or develop a new program of user preferences for the environment of the space based on an identified pattern of user behaviour.
  11. A system according to any one of the preceding claims wherein the parameters of the environment of the space comprise one or more of: a temperature of the space; a brightness of lighting in the space; a colour of lighting in the space; a source of audio content to be played within the space; a volume of an audio source of audio to be played within the space; a source of video content to be displayed within the space; a volume of audio content associated with video content to be displayed within the space; a setting of a window dressing within the space; a setting of a projection screen within the space; or other parameters affecting the environment within the space.
  12. A system according to any one of the preceding claims wherein the detection module is configured to detect the user's presence in the space by: detecting the presence in the space of an identification device carried or worn by the user or attached to an item that is carried or worn by the user.
  13. A system according to claim 12 wherein the identification device comprises: an RFID tag; an RFID equipped device; an NFC tag; an NFC equipped device; or a ZigBee equipped device.
  14. A system according to any one of the preceding claims wherein the detection module is configured to detect the user's presence in the space by: detecting the presence in the space of a user device.
  15. A system according to claim 14 wherein the user device comprises: a mobile telephone; a smartwatch; or a personal monitoring device.
  16. A system according to any one of the preceding claims wherein the detection module is configured to detect the user's presence in the space by: detecting a beacon signal transmitted by a device within the space.
  17. A system according to any one of the preceding claims wherein the detection module is configured to detect the user's presence in the space by detecting a trigger phrase uttered by the user.
  18. A system according to any one of the preceding claims wherein the detection module is configured to detect the user's presence in the space by analysing audio data received at a microphone to detect speech or a voice command of a known user of the system.
  19. A system according to any one of the preceding claims wherein the detection module is configured to notify the management module when it detects that the user has left the space.
  20. A system according to any one of the preceding claims wherein the one or more device controllers are located in the space or are cloud-based and located outside of the space.
  21. A management module for a system for adjusting parameters of an environment of a space according to preferences of a user of the system, the management module comprising: a processor; a memory configured to store user preferences for parameters of the environment of the space; and one or more communications interfaces configured to permit communication between the management module and one or more detection modules of the system and between the management module and one or more device controllers of the system, wherein the management module is configured to: on receiving an indication that the presence of the user has been detected in the space: retrieve the user preferences for the parameters of the environment of the space; and issue commands to the one or more device controllers of the system to adjust parameters of the environment of the space in accordance with the retrieved user preferences.
  22. A management module according to claim 21 wherein the management module is further configured to: detect movement of the user from the space to a new space that is different from the space; and if, in response to a prompt, the user indicates that an environment of the new space should be adjusted in accordance with user preferences for the new space: issue commands to one or more device controllers of the new space to adjust parameters of an environment of the new space in accordance with the user preferences for the parameters of the environment of the space.
  23. A management module according to claim 22 wherein the management module is configured to infer the user's egress from the space when it receives notification from a detection module of the system of detection of the presence of the user in a different space.
  24. A management module according to any one of claims 21-23 wherein the management module is configured to identify a location of a user based on notifications received from a plurality of detection modules indicating movement of the user through a plurality of detection zones.
  25. A management module according to any one of claims 21-24 wherein the management module is configured to: on detection of the user's presence in the space: detect whether another user is present in the space and parameters of the environment of the space have been adjusted in accordance with the another user's preferences; and if so, omit to issue commands to the one or more device controllers to adjust parameters of the environment of the space in accordance with the retrieved user preferences until the management module detects that either the another user has left the space, or detects that the another user has ceded control of the space.
  26. A management module according to any one of claims 21-25 wherein the management module includes a touch screen and a user interface configured to receive programming inputs for user preferences for parameters of the environment in the space.
  27. A management module according to any one of claims 21-26 wherein the management module is configured to receive programming inputs for user preferences for parameters of the environment in the space from a device external to the management module that communicates with the management module via a communications interface of the management module.
  28. A management module according to claim 26 or claim 27 wherein the management module is configured to identify the user prior to accepting programming inputs.
  29. A management module according to claim 28 wherein the management module is configured to identify the user by: user input of a username and password; recognising a device worn or carried by the user; recognising a dedicated identification device worn or carried by the user; recognising biometric data input by the user via a biometric sensor of the management module; and/or recognising the voice of the user from an utterance made by the user.
  30. A management module according to any one of claims 21-29 wherein the management module is configured to: receive, from the one or more device controllers of the system, information on user adjustments that affect the environment of the space; analyse the received information to identify a pattern of user behaviour; and adjust an existing program of user preferences for the environment of the space or develop a new program of user preferences for the environment of the space based on an identified pattern of user behaviour.
  31. A management module according to any one of claims 21-30 wherein the parameters of the environment of the space comprise one or more of: a temperature of the space; a brightness of lighting in the space; a colour of lighting in the space; a source of audio content to be played within the space; a volume of an audio source of audio to be played within the space; a source of video content to be displayed within the space; a volume of audio content associated with video content to be displayed within the space; a setting of a window dressing within the space; a setting of a projection screen within the space; or other parameters affecting the environment within the space.
GB1817880.6A 2018-11-01 2018-11-01 Home automation system Withdrawn GB2578624A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
GB1817880.6A GB2578624A (en) 2018-11-01 2018-11-01 Home automation system

Publications (2)

Publication Number Publication Date
GB201817880D0 GB201817880D0 (en) 2018-12-19
GB2578624A true GB2578624A (en) 2020-05-20

Family

ID=64655379

Family Applications (1)

Application Number Title Priority Date Filing Date
GB1817880.6A Withdrawn GB2578624A (en) 2018-11-01 2018-11-01 Home automation system

Country Status (1)

Country Link
GB (1) GB2578624A (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120158203A1 (en) * 2010-12-17 2012-06-21 Crestron Electronics, Inc. Personal Energy Management System
US20160132030A1 (en) * 2013-11-15 2016-05-12 Apple Inc. Aggregating user routines in an automated environment
US20180047230A1 (en) * 2014-04-25 2018-02-15 Vivint, Inc. Automatic system access using facial recognition
US20180139067A1 (en) * 2016-11-12 2018-05-17 Fujitsu Limited Persona-based service delivery
US20180219698A1 (en) * 2017-02-01 2018-08-02 Leigh M. Rothschild System and method of organizing an activity based on user preferences
US20180248972A1 (en) * 2017-02-24 2018-08-30 Samsung Electronics Co., Ltd System and method for automated personalization of an environment

Also Published As

Publication number Publication date
GB201817880D0 (en) 2018-12-19

Similar Documents

Publication Publication Date Title
US11024311B2 (en) Device leadership negotiation among voice interface devices
US11527249B2 (en) Multi-user personalization at a voice interface device
US10748552B2 (en) Noise mitigation for a voice interface device
US11830333B2 (en) Systems, methods, and devices for activity monitoring via a home assistant
US11527246B2 (en) Focus session at a voice interface device
CA3155437A1 (en) Audio-based load control system
GB2578624A (en) Home automation system

Legal Events

Date Code Title Description
WAP Application withdrawn, taken to be withdrawn or refused ** after publication under section 16(1)